diff --git a/README-ZH.md b/README-ZH.md index bcfcfb709..6a1d9cd14 100644 --- a/README-ZH.md +++ b/README-ZH.md @@ -7,7 +7,7 @@ ## 引言 -DataSphere Studio(简称DSS)是微众银行自研的一站式数据应用开发管理门户。 +DataSphere Studio(简称DSS)是微众银行自研的数据应用开发管理集成框架。 基于插拔式的集成框架设计,及计算中间件 [**Linkis**](https://github.com/WeBankFinTech/Linkis) ,可轻松接入上层各种数据应用系统,让数据开发变得简洁又易用。 @@ -37,170 +37,132 @@ DSS主要特点: 3、数据质量管理工具——[Qualitis](https://github.com/WeBankFinTech/Qualitis) - 4、工作流调度工具——[Azkaban](https://azkaban.github.io/) + 4、工作流调度工具——[Schedulis](https://github.com/WeBankFinTech/Schedulis) - **DSS插拔式的框架设计模式,允许用户快速替换DSS已集成的各个Web系统**。如:将Scriptis替换成Zeppelin,将Azkaban替换成DolphinScheduler。 + 5、数据交换工具——[Exchangis](https://github.com/WeBankFinTech/Exchangis) (**已支持免密跳转,等待Exchangis发版**) + + 6、数据Api服务——[DataApiService](https://github.com/WeBankFinTech/DataSphereStudio-Doc/blob/main/zh_CN/%E4%BD%BF%E7%94%A8%E6%96%87%E6%A1%A3/DataApiService%E4%BD%BF%E7%94%A8%E6%96%87%E6%A1%A3.md) + + 7、流式应用开发管理工具——[Streamis](https://github.com/WeBankFinTech/Streamis)(**即将开源**) + + **DSS插拔式的框架设计模式,允许用户快速替换DSS已集成的各个Web系统**。如:将 Scriptis 替换成Zeppelin,将 Schedulis 替换成DolphinScheduler。 ![DSS一站式](images/zh_CN/readme/onestop.gif) -### 二、基于Linkis计算中间件,打造独有的AppJoint设计理念 +### 二、基于Linkis计算中间件,打造独有的AppConn设计理念 - AppJoint,是DSS可以简单快速集成各种上层Web系统的核心概念。 + AppConn,是DSS可以简单快速集成各种上层Web系统的核心概念。 - AppJoint——应用关节,定义了一套统一的前后台接入规范,可让外部数据应用系统快速简单地接入,成为DSS数据应用开发中的一环。 + AppConn——应用连接器,定义了一套统一的前后台接入协议,总共分为三级规范,可让外部数据应用系统快速简单地接入,成为DSS数据应用开发中的一环。 + + AppConn的三级规范即:一级SSO规范,二级组织结构规范,三级开发流程规范; - DSS通过串联多个AppJoint,编排成一条支持实时执行和定时调度的工作流,用户只需简单拖拽即可完成数据应用的全流程开发。 + DSS通过串联多个 AppConn,编排成一条支持实时执行和定时调度的工作流,用户只需简单拖拽即可完成数据应用的全流程开发。 - 由于AppJoint对接了Linkis,外部数据应用系统因此具备了资源管控、并发限流、用户资源管理等能力,且允许上下文信息跨系统级共享,彻底告别应用孤岛。 + 由于 AppConn 对接了Linkis,外部数据应用系统因此具备了资源管控、并发限流、用户资源管理等能力,且允许上下文信息跨系统级共享,彻底告别应用孤岛。 -### 三、Project级管理单元 +### 三、Workspace级管理单元 - 以Project为管理单元,组织和管理各数据应用系统的业务应用,定义了一套跨数据应用系统的项目协同开发通用标准。 + 以 Workspace 为管理单元,组织和管理各数据应用系统的业务应用,定义了一套跨数据应用系统的工作空间协同开发通用标准,并提供了用户角色管理能力。 ### 四、已集成的数据应用组件 - 
DSS通过实现多个AppJoint,已集成了丰富多样的各种上层数据应用系统,基本可满足用户的数据开发需求。 - - **用户如果有需要,也可以轻松集成新的数据应用系统,以替换或丰富DSS的数据应用开发流程。** - - 1、DSS的调度能力——Azkaban AppJoint - - 用户的很多数据应用,通常希望具备周期性的调度能力。 - - 目前市面上已有的开源调度系统,与上层的其他数据应用系统整合度低,且难以融通。 - - DSS通过实现Azkaban AppJoint,允许用户将一个编排好的工作流,一键发布到Azkaban中进行定时调度。 - - DSS还为调度系统定义了一套标准且通用的DSS工作流解析发布规范,让其他调度系统可以轻松与DSS实现低成本对接。 - -![Azkaban](images/zh_CN/readme/Azkaban_AppJoint.gif) - - 2、数据开发——Scriptis AppJoint - - 什么是[Scriptis](https://github.com/WeBankFinTech/Scriptis)? - - Scriptis是一款支持在线写SQL、Pyspark、HiveQL等脚本,提交给[Linkis](https://github.com/WeBankFinTech/Linkis)执行的数据分析Web工具,且支持UDF、函数、资源管控和智能诊断等企业级特性。 - - Scriptis AppJoint为DSS集成了Scriptis的数据开发能力,并允许Scriptis的各种脚本类型,作为DSS工作流的节点,参与到应用开发的流程中。 - - 目前已支持HiveSQL、SparkSQL、Pyspark、Scala等脚本节点类型。 - -![Scriptis](images/zh_CN/readme/Scriptis_AppJoint.gif) - - 3、数据可视化——Visualis AppJoint - - 什么是Visualis? - - Visualis是一个基于宜信开源项目Davinci二次开发的数据可视化BI工具,为用户在数据安全和权限方面,提供金融级数据可视化能力。 - - Visualis AppJoint为DSS集成了Visualis的数据可视化能力,并允许数据大屏和仪表盘,作为DSS工作流的节点,与上游的数据集市关联起来。 - -![Visualis](images/zh_CN/readme/Visualis_AppJoint.gif) - - 4、数据质量——Qualitis AppJoint - - Qualitis AppJoint 为DSS集成数据质量校验能力,将数据质量系统集成到DSS工作流开发中,对数据完整性、正确性等进行校验。 - -![Qualitis](images/zh_CN/readme/Qualitis_AppJoint.gif) - - 5、数据发送——Sender AppJoint - - Sender AppJoint为DSS集成数据发送能力,目前支持SendEmail节点类型,所有其他节点的结果集,都可以通过邮件发送。 - - 例如:SendEmail节点可直接将Display数据大屏作为邮件发送出来。 - - 6、信号节点——Signal AppJoint - - EventChecker AppJoint用于强化业务与流程之间的解耦和相互关联。 - - DataChecker节点:检查库表分区是否存在。 - - EventSender: 跨工作流和工程的消息发送节点。 - - EventReceiver: 跨工作流和工程的消息接收节点。 - - 7、功能节点 - - 空节点、子工作流节点。 - - 8、**节点扩展** - - **根据需要,用户可以简单快速替换DSS已集成的各种功能组件,或新增功能组件。** + DSS通过实现多个AppConn,已集成了丰富多样的各种上层数据应用系统,基本可满足用户的数据开发需求。 + + **如果有需要,也可以轻松集成新的数据应用系统,以替换或丰富DSS的数据应用开发流程。** [点我了解如何快速集成新的应用系统](https://github.com/WeBankFinTech/DataSphereStudio-Doc/blob/main/zh_CN/%E5%BC%80%E5%8F%91%E6%96%87%E6%A1%A3/%E7%AC%AC%E4%B8%89%E6%96%B9%E7%B3%BB%E7%BB%9F%E6%8E%A5%E5%85%A5DSS%E5%BC%80%E5%8F%91%E6%8C%87%E5%8D%97.md) + +| 应用工具 | 描述 | 
DSS0.X 版本要求 | DSS1.0 版本要求 | 版本规划 | +| --------------- | -------------------------------------------------------------------- | --------------------------------------------------------------------- | ---------- | ------ | +| **DataApiService** | 数据API服务。可快速将SQL脚本发布为一个Restful接口,对外提供Rest访问能力 | 不支持 | >=1.0.0 | 已发布 | +| **Airflow** | 支持将DSS工作流发布到Airflow进行定时调度 | >=0.9.1,尚未合并 | on going | **待规划** | +| **Streamis** | 流式应用开发管理工具。支持发布Flink Jar 和 Flink SQL ,提供流式应用的开发调试和生产管理能力,如:启停、状态监控、checkpoint等。 | 不支持 | >=1.0.0 | **即将发布** | +| **UserManager** | 自动初始化一个DSS新用户所必须的所有用户环境,包含:创建Linux用户、各种用户路径、目录授权等 | >=0.9.1 | on going | **待规划** | +| **EventCheck** | 提供跨业务、跨工程和跨工作流的信号通信能力。 | >=0.5.0 | >=1.0.0 | 已发布 | +| **SendEmail** | 提供数据发送能力,所有其他工作流节点的结果集,都可以通过邮件进行发送 | >=0.5.0 | >=1.0.0 | 已发布 | +| [**Scriptis**](https://github.com/WeBankFinTech/Scriptis) | 支持在线写SQL、Pyspark、HiveQL等脚本,提交给[Linkis](https://github.com/WeBankFinTech/Linkis)执行的数据分析Web工具。 | >=0.5.0 | >=1.0.0 | 已发布 | +| [**Visualis**](https://github.com/WeBankFinTech/Visualis) | 基于宜信开源项目Davinci二次开发的数据可视化BI工具,为用户在数据安全方面提供金融级数据可视化能力。 | >=0.5.0 | >=1.0.0 | 已发布 | +| [**Qualitis**](https://github.com/WeBankFinTech/Qualitis) | 数据质量校验工具,提供数据完整性、正确性等数据校验能力 | >=0.5.0 | >=1.0.0 | **待发布** | +| [**Schedulis**](https://github.com/WeBankFinTech/Schedulis) | 基于Azkaban二次开发的工作流任务调度系统,具备高性能,高可用和多租户资源隔离等金融级特性。 | >=0.5.0 | >=1.0.0 | 已发布 | +| [**Exchangis**](https://github.com/WeBankFinTech/Exchangis) | 支持对结构化及无结构化的异构数据源之间的数据传输的数据交换平台 | 不支持 | >=1.0.0 | **待发布** | + ## Demo试用环境 - 由于DataSphereStudio支持执行脚本风险较高,WeDataSphere Demo环境的隔离没有做完,考虑到大家都在咨询Demo环境,决定向社区先定向发放邀请码,接受企业和组织的试用申请。 + 由于 DataSphereStudio 支持执行脚本风险较高,WeDataSphere Demo环境的隔离没有做完,考虑到大家都在咨询Demo环境,决定向社区先定向发放邀请码,接受企业和组织的试用申请。 如果您想试用Demo环境,请加入DataSphere Studio社区用户群(**加群方式请翻到本文档末尾处**),联系团队成员获取邀请码。 - WeDataSphere Demo环境用户注册页面:https://sandbox.webank.com/wds/dss/#/register + DataSphereStudio Demo环境用户注册页面:[点我进入](https://www.ozone.space/wds/dss/#/register) - WeDataSphere 
Demo环境登录页面:https://sandbox.webank.com/wds/dss/ + DataSphereStudio Demo环境登录页面:[点我进入](https://www.ozone.space/wds/dss/#/login) - 我们会尽快解决环境隔离问题,争取早日向社区完全开放WeDataSphere Demo环境。 + **DataSphereStudio1.0 Demo环境将在近期开放,敬请期待**。 -## 与类似系统对比 +## 下载 - DSS是一个引领数据应用开发管理方向的开源项目,开源社区目前尚没有同类产品。 + 请前往 [DSS releases](https://github.com/WeBankFinTech/DataSphereStudio/releases) 页面下载 DSS 的已编译版本或源码包。 -## 使用场景 +## 编译和安装部署 - DataSphere Studio适用于以下场景: +请参照 [编译指引](https://github.com/WeBankFinTech/DataSphereStudio-Doc/blob/main/zh_CN/%E5%BC%80%E5%8F%91%E6%96%87%E6%A1%A3/DSS%E7%BC%96%E8%AF%91%E6%96%87%E6%A1%A3.md) 来编译 DSS 源码。 - 1. 正在筹建或初步具备大数据平台能力,但无任何数据应用工具的场景。 +请参考 [安装部署文档](https://github.com/WeBankFinTech/DataSphereStudio-Doc/blob/main/zh_CN/%E5%AE%89%E8%A3%85%E9%83%A8%E7%BD%B2/DSS%E5%8D%95%E6%9C%BA%E9%83%A8%E7%BD%B2%E6%96%87%E6%A1%A3.md) 来部署 DSS。 - 2. 已具备大数据基础平台能力,且仅有少数数据应用工具的场景。 +## 示例和使用指引 - 3. 已具备大数据基础平台能力,且拥有全部数据应用工具,但工具间尚未打通,用户使用隔离感强、学习成本高的场景。 +请到 [用户使用文档](https://github.com/WeBankFinTech/DataSphereStudio-Doc/blob/main/zh_CN/%E4%BD%BF%E7%94%A8%E6%96%87%E6%A1%A3/DSS%E7%94%A8%E6%88%B7%E6%89%8B%E5%86%8C.md) ,了解如何快速使用DSS。 - 4. 
已具备大数据基础平台能力,且拥有全部数据应用工具,部分工具已实现对接,但尚未定义统一规范的场景。 +## 文档 -## 快速安装使用 +DSS1.0的完整文档列表,请参见 [DSS-Doc](https://github.com/WeBankFinTech/DataSphereStudio-Doc/tree/main/zh_CN) -点我进入[快速安装使用](docs/zh_CN/ch2/DSS_LINKIS_Quick_Install.md) +以下为 DSS 相关 AppConn 插件的安装指南: -## 架构 +- [DSS的Visualis AppConn插件安装指南](https://github.com/WeBankFinTech/DataSphereStudio-Doc/blob/main/zh_CN/%E5%AE%89%E8%A3%85%E9%83%A8%E7%BD%B2/VisualisAppConn%E6%8F%92%E4%BB%B6%E5%AE%89%E8%A3%85%E6%96%87%E6%A1%A3.md) -![DSS架构](images/zh_CN/readme/architecture.png) +- [DSS的Schedulis AppConn插件安装指南](https://github.com/WeBankFinTech/DataSphereStudio-Doc/blob/main/zh_CN/%E5%AE%89%E8%A3%85%E9%83%A8%E7%BD%B2/SchedulisAppConn%E6%8F%92%E4%BB%B6%E5%AE%89%E8%A3%85%E6%96%87%E6%A1%A3.md) -## 文档列表 +- [DSS的Qualitis AppConn插件安装指南](https://github.com/WeBankFinTech/DataSphereStudio-Doc/blob/main/zh_CN/%E5%AE%89%E8%A3%85%E9%83%A8%E7%BD%B2/QualitisAppConn%E6%8F%92%E4%BB%B6%E5%AE%89%E8%A3%85%E6%96%87%E6%A1%A3.md) -#### 1. 安装编译文档 +- [DSS的Exchangis AppConn插件安装指南](https://github.com/WeBankFinTech/DataSphereStudio-Doc/blob/main/zh_CN/%E5%AE%89%E8%A3%85%E9%83%A8%E7%BD%B2/ExchangisAppConn%E6%8F%92%E4%BB%B6%E5%AE%89%E8%A3%85%E6%96%87%E6%A1%A3.md) -[快速安装使用文档](docs/zh_CN/ch2/DSS_LINKIS_Quick_Install.md) -[**DSS安装常见问题列表**](docs/zh_CN/ch1/DSS安装常见问题列表.md) +## 架构 -[DSS编译文档](docs/zh_CN/ch1/DSS编译文档.md) +![DSS架构](images/zh_CN/readme/architecture.png) -#### 2. 使用文档 -[快速使用文档](docs/zh_CN/ch3/DataSphere_Studio_QuickStart.md) +## 使用场景 -[用户手册](docs/zh_CN/ch3/DSS_User_Manual.md) + DataSphere Studio适用于以下场景: -#### 3. AppJoint插件安装文档 + 1. 正在筹建或初步具备大数据平台能力,但无任何数据应用工具的场景。 -**以下为手动安装相关插件的指南,DSS一键安装【标准版】已自动安装了以下插件,可忽略。** + 2. 已具备大数据基础平台能力,且仅有少数数据应用工具的场景。 -[DSS的Azkaban AppJoint插件安装指南](docs/zh_CN/ch4/如何接入调度系统Azkaban.md) + 3. 已具备大数据基础平台能力,且拥有全部数据应用工具,但工具间尚未打通,用户使用隔离感强、学习成本高的场景。 -[DSS的Qualitis AppJoint插件安装指南](https://github.com/WeBankFinTech/Qualitis/blob/master/docs/zh_CN/ch1/%E6%8E%A5%E5%85%A5%E5%B7%A5%E4%BD%9C%E6%B5%81%E6%8C%87%E5%8D%97.md) + 4. 
已具备大数据基础平台能力,且拥有全部数据应用工具,部分工具已实现对接,但尚未定义统一规范的场景。 -#### 4. 第三方系统如何接入文档 -[DSS如何快速集成第三方系统](docs/zh_CN/ch4/第三方系统接入DSS指南.md) +## 贡献 -#### 5. 架构文档 +我们非常欢迎和期待更多的贡献者参与共建 DSS, 不论是代码、文档,或是其他能够帮助到社区的贡献形式。 -[DSS工程发布到调度系统的架构设计](docs/zh_CN/ch4/DSS工程发布调度系统架构设计.md) +## 联系我们 -更多文档,敬请期待! +对 DSS 的任何问题和建议,敬请提交issue,以便跟踪处理和经验沉淀共享。 -## 交流贡献 +您也可以扫描下面的二维码,加入我们的微信/QQ群,以获得更快速的响应。 ![交流](images/zh_CN/readme/communication.png) +## 谁在使用 DSS + +我们创建了 [Who is using DSS](https://github.com/WeBankFinTech/DataSphereStudio/issues/1) issue 以便用户反馈和记录谁在使用 DSS,欢迎您注册登记. + +DSS 自2019年开源发布以来,累计已有700多家试验企业和1000+沙盒试验用户,涉及金融、电信、制造、互联网等多个行业。 + ## License DSS is under the Apache 2.0 license. See the [License](LICENSE) file for details. diff --git a/README.md b/README.md index a3bfcfe4e..f37d3639d 100644 --- a/README.md +++ b/README.md @@ -33,19 +33,21 @@ Please be patient, it will take some time to load gif. c. [Qualitis](https://github.com/WeBankFinTech/Qualitis) - Data Quality Management Tool - d. [Azkaban](https://azkaban.github.io/) - Batch workflow job scheduler + d. [Schedulis](https://github.com/WeBankFinTech/Schedulis) - Batch workflow job scheduler + + f. [Exchangis](https://github.com/WeBankFinTech/Exchangis) - Data Exchange Tool ![DSS one-stop video](images/en_US/readme/onestop.gif) -### 2. AppJoint, based on Linkis,defines a unique design concept +### 2. AppConn, based on Linkis,defines a unique design concept - AppJoint——application joint, defining unified front-end and back-end + AppConn——application connector, defining unified front-end and back-end integration specifications, can quickly and easily integrate with external data application systems, making them as part of DSS data application development. - DSS arranges multiple AppJoints in series to form a workflow that supports real-time execution and scheduled execution. Users can complete the entire process development of data applications with simple drag and drop operations. 
+    DSS arranges multiple AppConns in series to form a workflow that supports real-time execution and scheduled execution. Users can complete the entire process development of data applications with simple drag and drop operations.
 
-    Since AppJoint is integrated with Linkis, the external data application system shares the capabilities of resource management, concurrent limiting, and high performance. AppJoint also allows sharable context across system level and completely gets away from application silos.
+    Since AppConn is integrated with Linkis, the external data application system shares the capabilities of resource management, concurrent limiting, and high performance. AppConn also allows sharable context across system level and completely gets away from application silos.
 
 ### 3. Project, as the management unit
 
@@ -53,55 +55,55 @@ Please be patient, it will take some time to load gif.
 
 ### 4. Integrated data application components
 
-    a. Azkaban AppJoint —— Batch workflow job scheduler
+    a. Schedulis AppConn —— Batch workflow job scheduler
 
     Many data applications developed by users usually require periodic scheduling capability.
 
     At present, the open source scheduling system in the community is pretty unfriendly to integrate with other data application systems.
 
-    DSS implements Azkaban AppJoint, which allows users to publish DSS workflows to Azkaban for regular scheduling.
+    DSS implements Schedulis AppConn, which allows users to publish DSS workflows to Schedulis for regular scheduling.
 
     DSS also defines standard and generic workflow parsing and publishing specifications for scheduling systems, allowing other scheduling systems to easily achieve low-cost integration with DSS.
 
 ![Azkaban](images/en_US/readme/Azkaban_AppJoint.gif)
 
-    b. Scriptis AppJoint —— Data Development IDE Tool
+    b. Scriptis AppConn —— Data Development IDE Tool
 
     What is [Scriptis](https://github.com/WeBankFinTech/Scriptis)?
Scriptis is for interactive data analysis with script development(SQL, Pyspark, HiveQL), task submission(Spark, Hive), UDF, function, resource management and intelligent diagnosis. - Scriptis AppJoint integrates the data development capabilities of Scriptis to DSS, and allows various script types of Scriptis to serve as nodes in the DSS workflow to participate in the application development process. + Scriptis AppConn integrates the data development capabilities of Scriptis to DSS, and allows various script types of Scriptis to serve as nodes in the DSS workflow to participate in the application development process. Currently supports HiveSQL, SparkSQL, Pyspark, Scala and other script node types. ![Scriptis](images/en_US/readme/Scriptis_AppJoint.gif) - c. Visualis AppJoint —— Data Visualization Tool + c. Visualis AppConn —— Data Visualization Tool What is [Visualis](https://github.com/WeBankFinTech/Visualis)? Visualis is a BI tool for data visualization. It provides financial-grade data visualization capabilities on the basis of data security and permissions, based on the open source project Davinci contributed by CreditEase. - Visualis AppJoint integrates data visualization capabilities to DSS, and allows displays and dashboards, as nodes of DSS workflows, to be associated with upstream data market. + Visualis AppConn integrates data visualization capabilities to DSS, and allows displays and dashboards, as nodes of DSS workflows, to be associated with upstream data market. ![Visualis](images/en_US/readme/Visualis_AppJoint.gif) - d. Qualitis AppJoint —— Data quality management Tool + d. Qualitis AppConn —— Data quality management Tool - Qualitis AppJoint integrates data quality verification capabilities for DSS, allows Qualitis as a node in DSS workflow + Qualitis AppConn integrates data quality verification capabilities for DSS, allows Qualitis as a node in DSS workflow ![Qualitis](images/en_US/readme/Qualitis_AppJoint.gif) - e. Data Sender——Sender AppJoint + e. 
Data Sender——Sender AppConn - Sender AppJoint provides data delivery capability for DSS. Currently it supports the SendEmail node type, and the result sets of all other nodes can be sent via email. + Sender AppConn provides data delivery capability for DSS. Currently it supports the SendEmail node type, and the result sets of all other nodes can be sent via email. For example, the SendEmail node can directly send the screen shot of a display as an email. - f. Signal AppJoint —— Signal Nodes + f. Signal AppConn —— Signal Nodes - Signal AppJoint is used to strengthen the correlation between business and process while keeping them decoupled. + Signal AppConn is used to strengthen the correlation between business and process while keeping them decoupled. DataChecker Node:Checks whether a table or partition exists. diff --git a/assembly/bin/appconn-install.sh b/assembly/bin/appconn-install.sh new file mode 100644 index 000000000..a35217b31 --- /dev/null +++ b/assembly/bin/appconn-install.sh @@ -0,0 +1,109 @@ +#!/bin/sh +#Actively load user env +source ~/.bashrc +shellDir=`dirname $0` +workDir=`cd ${shellDir}/..;pwd` + +SOURCE_ROOT=${workDir} + +#load config +source ${SOURCE_ROOT}/conf/config.sh +source ${SOURCE_ROOT}/conf/db.sh + +APPCONN_NAME='' +APPCONN_INSTALL_IP=127.0.0.1 +APPCONN_INSTALL_PORT=8088 + +#echo "Current path of init sql is ${DB_DML_PATH}" +LOCAL_IP="`ifconfig | grep 'inet' | grep -v '127.0.0.1' | cut -d: -f2 | awk '{ print $2}'`" + +function isSuccess(){ + if [ $? -ne 0 ]; then + echo "Failed to " + $1 + exit 1 + else + echo "Succeed to" + $1 + fi +} + +PROC_NAME=DSSProjectServerApplication +ProcNumber=`ps -ef |grep -w $PROC_NAME|grep -v grep|wc -l` +if [ $ProcNumber -le 0 ];then + echo "${PROC_NAME} is not running,Please check whether DSS is installed" + exit 1000 +else + echo "Begine to install appconn" +fi + +##choose install mysql mode +function initInstallAppConn() { + echo "Please select the type of installation component?" 
+ echo " 1: schedulis" + echo " 2: visualis" + echo " 3:Your AppConn Name" + echo " 4:exit" + read -p "Please input the choice:" idx + if [[ '1' = "$idx" ]];then + APPCONN_NAME="schedulis" + elif [[ '2' = "$idx" ]];then + APPCONN_NAME="visualis" + elif [[ '4' = "$idx" ]];then + echo "no choice,exit!" + exit 1 + else + APPCONN_NAME=$idx + fi + echo "Current installation component is ${APPCONN_NAME}" + + echo "" + echo "If this machine(127.0.0.1) is installed, enter 1" + echo "For others, you need to enter a complete IP address." + read -p "Please enter the ip of appconn: " ip + APPCONN_INSTALL_IP=$ip + if [[ '1' = "$ip" ]];then + APPCONN_INSTALL_IP="127.0.0.1" + fi + echo "You input ip is ${APPCONN_INSTALL_IP}" + + echo "" + read -p "Please enter the port of appconn:" port + APPCONN_INSTALL_PORT=$port + echo "You input ip is ${APPCONN_INSTALL_PORT}" +} + +function replaceCommonIp() { + if [[ $APPCONN_INSTALL_IP == "127.0.0.1" ]] || [[ $APPCONN_INSTALL_IP == "0.0.0.0" ]];then + echo "APPCONN_INSTALL_IP is equals $APPCONN_INSTALL_IP, we will change it to ip address" + APPCONN_INSTALL_IP=$LOCAL_IP + fi +} + +##choose execute mysql mode +function executeSQL() { + TEMP_DB_DML_PATH=${SOURCE_ROOT}/dss-appconns/${APPCONN_NAME}/db + DB_DML_PATH=$TEMP_DB_DML_PATH/init_real.sql + cp -rf $TEMP_DB_DML_PATH/init.sql $DB_DML_PATH + sed -i "s/APPCONN_INSTALL_IP/$APPCONN_INSTALL_IP/g" $DB_DML_PATH + sed -i "s/APPCONN_INSTALL_PORT/$APPCONN_INSTALL_PORT/g" $DB_DML_PATH + sed -i "s#DSS_INSTALL_HOME_VAL#$DSS_INSTALL_HOME#g" $DB_DML_PATH + mysql -h$MYSQL_HOST -P$MYSQL_PORT -u$MYSQL_USER -p$MYSQL_PASSWORD -D$MYSQL_DB --default-character-set=utf8 -e "source $DB_DML_PATH" + isSuccess "source $DB_DML_PATH" + echo "the table update finished" +} + +echo "" +echo "step1:Initialize installation settings" +initInstallAppConn +echo "" + +echo "step2:replaceIp" +replaceCommonIp +echo "" + +echo "step3:update database" +executeSQL +echo "" + +echo "step4:refresh appconn load" +curl -H 
"Token-Code:BML-AUTH" -H "Token-User:hadoop" -X GET http://${GATEWAY_INSTALL_IP}:${GATEWAY_PORT}/api/rest_j/v1/dss/framework/project/appconn/${APPCONN_NAME}/load +echo "" \ No newline at end of file diff --git a/bin/checkEnv.sh b/assembly/bin/checkEnv.sh similarity index 97% rename from bin/checkEnv.sh rename to assembly/bin/checkEnv.sh index d51bd5ca2..39f546d26 100644 --- a/bin/checkEnv.sh +++ b/assembly/bin/checkEnv.sh @@ -36,8 +36,6 @@ echo "<-----start to check used cmd---->" need_cmd yum need_cmd java need_cmd mysql -need_cmd unzip -need_cmd expect need_cmd telnet need_cmd tar need_cmd sed diff --git a/assembly/bin/excecuteSQL.sh b/assembly/bin/excecuteSQL.sh new file mode 100644 index 000000000..8896cf3c0 --- /dev/null +++ b/assembly/bin/excecuteSQL.sh @@ -0,0 +1,102 @@ +#!/bin/sh + +function checkExternalServer(){ + echo "telnet check for your $SERVER_NAME, if you wait for a long time,may be your $SERVER_NAME does not prepared" + result=`echo -e "\n" | telnet $EXTERNAL_SERVER_IP $EXTERNAL_SERVER_PORT 2>/dev/null | grep Connected | wc -l` + if [ $result -eq 1 ]; then + echo "$SERVER_NAME is OK." + else + echo "$SERVER_NAME is Bad. You need to prepare the' $SERVER_NAME ' environment in advance" + exit 1 + fi +} + +## choose install mode +function chooseInstallMode() { + echo "Simple installation mode" + #check for Java + checkJava + #check for mysql + SERVER_NAME=MYSQL + EXTERNAL_SERVER_IP=$MYSQL_HOST + EXTERNAL_SERVER_PORT=$MYSQL_PORT + checkExternalServer +} + +##choose install mysql mode +function chooseInstallMySQLMode() { + echo "Do you want to clear Dss table information in the database?" + echo " 1: Do not execute table-building statements" + echo " 2: Dangerous! Clear all data and rebuild the tables." 
+ echo "" + MYSQL_INSTALL_MODE=1 + read -p "Please input the choice:" idx + if [[ '2' = "$idx" ]];then + MYSQL_INSTALL_MODE=2 + echo "You chose Rebuild the table" + elif [[ '1' = "$idx" ]];then + MYSQL_INSTALL_MODE=1 + echo "You chose not execute table-building statements" + else + echo "no choice,exit!" + exit 1 + fi + + ##init db + if [[ '2' = "$MYSQL_INSTALL_MODE" ]];then + ENV_FLAG="dev" + DB_CONF_PATH=${workDir}/db + DB_DML_PATH=$DB_CONF_PATH/dss_dml_real.sql + replaceAppConnInstanceSQL + executeSQL + fi +} + +##choose execute mysql mode +function executeSQL() { + chooseInstallMode + + sed -i "s/GATEWAY_INSTALL_IP/$GATEWAY_INSTALL_IP/g" $DB_DML_PATH + sed -i "s/GATEWAY_PORT/$GATEWAY_PORT/g" $DB_DML_PATH + + sed -i "s#DSS_INSTALL_HOME_VAL#$DSS_INSTALL_HOME#g" $DB_DML_PATH + + mysql -h$MYSQL_HOST -P$MYSQL_PORT -u$MYSQL_USER -p$MYSQL_PASSWORD -D$MYSQL_DB --default-character-set=utf8 -e "source $DB_CONF_PATH/dss_ddl.sql" + isSuccess "source dss_ddl.sql" + mysql -h$MYSQL_HOST -P$MYSQL_PORT -u$MYSQL_USER -p$MYSQL_PASSWORD -D$MYSQL_DB --default-character-set=utf8 -e "source $DB_DML_PATH" + isSuccess "source dss_dml_real.sql" + echo "Rebuild the table" +} + +function replaceAppConnInstanceSQL() { + DB_DML_PATH=$DB_CONF_PATH/dss_dml_real.sql + cp -rf $DB_CONF_PATH/dss_dml.sql $DB_DML_PATH + sed -i "s#ORCHESTRATOR_IP#$DSS_FRAMEWORK_ORCHESTRATOR_SERVER_INSTALL_IP#g" $DB_DML_PATH + sed -i "s#ORCHESTRATOR_PORT#$DSS_FRAMEWORK_ORCHESTRATOR_SERVER_PORT#g" $DB_DML_PATH + + sed -i "s#WORKFLOW_IP#$DSS_WORKFLOW_SERVER_INSTALL_IP#g" $DB_DML_PATH + sed -i "s#WORKFLOW_PORT#$DSS_WORKFLOW_SERVER_PORT#g" $DB_DML_PATH + + sed -i "s#EVENTCHECKER_JDBC_URL#$EVENTCHECKER_JDBC_URL#g" $DB_DML_PATH + sed -i "s#EVENTCHECKER_JDBC_USERNAME#$EVENTCHECKER_JDBC_USERNAME#g" $DB_DML_PATH + sed -i "s#EVENTCHECKER_JDBC_PASSWORD#$EVENTCHECKER_JDBC_PASSWORD#g" $DB_DML_PATH + + sed -i "s#DATACHECKER_JOB_JDBC_URL#$DATACHECKER_JOB_JDBC_URL#g" $DB_DML_PATH + sed -i 
"s#DATACHECKER_JOB_JDBC_USERNAME#$DATACHECKER_JOB_JDBC_USERNAME#g" $DB_DML_PATH + sed -i "s#DATACHECKER_JOB_JDBC_PASSWORD#$DATACHECKER_JOB_JDBC_PASSWORD#g" $DB_DML_PATH + + sed -i "s#DATACHECKER_BDP_JDBC_URL#$DATACHECKER_BDP_JDBC_URL#g" $DB_DML_PATH + sed -i "s#DATACHECKER_BDP_JDBC_USERNAME#$DATACHECKER_BDP_JDBC_USERNAME#g" $DB_DML_PATH + sed -i "s#DATACHECKER_BDP_JDBC_PASSWORD#$DATACHECKER_BDP_JDBC_PASSWORD#g" $DB_DML_PATH + + sed -i "s#BDP_MASK_IP#127.0.0.1#g" $DB_DML_PATH + sed -i "s#BDP_MASK_PORT#8087#g" $DB_DML_PATH + + sed -i "s#EMAIL_HOST#${EMAIL_HOST}#g" $DB_DML_PATH + sed -i "s#EMAIL_PORT#${EMAIL_PORT}#g" $DB_DML_PATH + sed -i "s#EMAIL_USERNAME#${EMAIL_USERNAME}#g" $DB_DML_PATH + sed -i "s#EMAIL_PASSWORD#${EMAIL_PASSWORD}#g" $DB_DML_PATH + sed -i "s#EMAIL_PROTOCOL#${EMAIL_PROTOCOL}#g" $DB_DML_PATH +} + +chooseInstallMySQLMode diff --git a/assembly/bin/install.sh b/assembly/bin/install.sh new file mode 100644 index 000000000..b02b48c0a --- /dev/null +++ b/assembly/bin/install.sh @@ -0,0 +1,313 @@ +#!/bin/sh +#Actively load user env +if [ -f "~/.bashrc" ];then + echo "Warning! user bashrc file does not exist." 
+else + source ~/.bashrc +fi + +shellDir=`dirname $0` +workDir=`cd ${shellDir}/..;pwd` + +SERVER_IP="" +SERVER_HOME="" + +local_host="`hostname --fqdn`" +LOCAL_IP="`ifconfig | grep 'inet' | grep -v '127.0.0.1' | cut -d: -f2 | awk '{ print $2}'`" + +#To be compatible with MacOS and Linux +txt="" +if [[ "$OSTYPE" == "darwin"* ]]; then + txt="''" +elif [[ "$OSTYPE" == "linux-gnu" ]]; then + # linux + txt="" +elif [[ "$OSTYPE" == "cygwin" ]]; then + echo "dss not support Windows operating system" + exit 1 +elif [[ "$OSTYPE" == "msys" ]]; then + echo "dss not support Windows operating system" + exit 1 +elif [[ "$OSTYPE" == "win32" ]]; then + echo "dss not support Windows operating system" + exit 1 +elif [[ "$OSTYPE" == "freebsd"* ]]; then + txt="" +else + echo "Operating system unknown, please tell us(submit issue) for better service" + exit 1 +fi + +function isSuccess(){ + if [ $? -ne 0 ]; then + echo "Failed to " + $1 + exit 1 + else + echo "Succeed to" + $1 + fi +} + +function checkJava(){ + java -version + isSuccess "execute java --version" +} + +checkJava + +dos2unix ${workDir}/config/* +dos2unix ${workDir}/bin/* + +echo "step1:load config" +source ${workDir}/config/config.sh +source ${workDir}/config/db.sh + +DSS_FILE_PATH="$workDir/$DSS_FILE_NAME" +#dos2unix ${DSS_FILE_PATH}/sbin/* +#dos2unix ${DSS_FILE_PATH}/sbin/ext/* +if [ -z $DSS_FILE_NAME ]; then + echo "DSS_FILE_NAME is null " + exit 1 +fi + +function replaceCommonIp() { + if [ -z "$DSS_FRAMEWORK_PROJECT_SERVER_INSTALL_IP" ]; then + DSS_FRAMEWORK_PROJECT_SERVER_INSTALL_IP=$LOCAL_IP + fi + if [ -z "$DSS_FRAMEWORK_PROJECT_SERVER_PORT" ]; then + DSS_FRAMEWORK_PROJECT_SERVER_PORT=9002 + fi + + if [ -z "$DSS_FRAMEWORK_ORCHESTRATOR_SERVER_INSTALL_IP" ]; then + DSS_FRAMEWORK_ORCHESTRATOR_SERVER_INSTALL_IP=$LOCAL_IP + fi + if [ -z "$DSS_FRAMEWORK_ORCHESTRATOR_SERVER_PORT" ]; then + DSS_FRAMEWORK_ORCHESTRATOR_SERVER_PORT=9003 + fi + + if [ -z "$DSS_APISERVICE_SERVER_INSTALL_IP" ]; then + 
DSS_APISERVICE_SERVER_INSTALL_IP=$LOCAL_IP + fi + if [ -z "$DSS_APISERVICE_SERVER_PORT" ]; then + DSS_APISERVICE_SERVER_PORT=9004 + fi + + if [ -z "$DSS_WORKFLOW_SERVER_INSTALL_IP" ]; then + DSS_WORKFLOW_SERVER_INSTALL_IP=$LOCAL_IP + fi + if [ -z "$DSS_WORKFLOW_SERVER_PORT" ]; then + DSS_WORKFLOW_SERVER_PORT=9005 + fi + + if [ -z "$DSS_FLOW_EXECUTION_SERVER_INSTALL_IP" ]; then + DSS_FLOW_EXECUTION_SERVER_INSTALL_IP=$LOCAL_IP + fi + if [ -z "$DSS_FLOW_EXECUTION_SERVER_PORT" ]; then + DSS_FLOW_EXECUTION_SERVER_PORT=9006 + fi + + if [ -z "$DSS_DATAPIPE_SERVER_INSTALL_IP" ]; then + DSS_DATAPIPE_SERVER_INSTALL_IP=$LOCAL_IP + fi + if [ -z "$DSS_DATAPIPE_SERVER_PORT" ]; then + DSS_DATAPIPE_SERVER_PORT=9008 + fi + + if [[ $GATEWAY_INSTALL_IP == "127.0.0.1" ]] || [ -z "$GATEWAY_INSTALL_IP" ]; then + echo "GATEWAY_INSTALL_IP is equals $GATEWAY_INSTALL_IP ,we will change it to ip address" + GATEWAY_INSTALL_IP=$LOCAL_IP + fi + if [[ $EUREKA_INSTALL_IP == "127.0.0.1" ]] || [ -z "$EUREKA_INSTALL_IP" ]; then + echo "EUREKA_INSTALL_IP is equals $EUREKA_INSTALL_IP ,we will change it to ip address" + EUREKA_INSTALL_IP=$LOCAL_IP + fi +} +##提换真实的IP +replaceCommonIp + +EUREKA_URL=http://$EUREKA_INSTALL_IP:$EUREKA_PORT/eureka/ + +## excecute sql +source ${workDir}/bin/excecuteSQL.sh + +function changeCommonConf(){ + sed -i "s#defaultZone:.*#defaultZone: $EUREKA_URL#g" $CONF_APPLICATION_YML + sed -i "s#hostname:.*#hostname: $SERVER_IP#g" $CONF_APPLICATION_YML + sed -i "s#wds.linkis.server.mybatis.datasource.url.*#wds.linkis.server.mybatis.datasource.url=jdbc:mysql://${MYSQL_HOST}:${MYSQL_PORT}/${MYSQL_DB}?characterEncoding=UTF-8#g" $CONF_DSS_PROPERTIES + sed -i "s#wds.linkis.server.mybatis.datasource.username.*#wds.linkis.server.mybatis.datasource.username=$MYSQL_USER#g" $CONF_DSS_PROPERTIES + sed -i "s#wds.linkis.server.mybatis.datasource.password.*#***REMOVED***$MYSQL_PASSWORD#g" $CONF_DSS_PROPERTIES + sed -i "s#wds.linkis.gateway.ip.*#wds.linkis.gateway.ip=$GATEWAY_INSTALL_IP#g" 
$CONF_DSS_PROPERTIES + sed -i "s#wds.linkis.gateway.port.*#wds.linkis.gateway.port=$GATEWAY_PORT#g" $CONF_DSS_PROPERTIES + sed -i "s#wds.linkis.gateway.url.*#wds.linkis.gateway.url=http://$GATEWAY_INSTALL_IP:$GATEWAY_PORT/#g" $CONF_DSS_PROPERTIES + sed -i "s#wds.linkis.gateway.wtss.url.*#wds.linkis.gateway.wtss.url=http://$GATEWAY_INSTALL_IP:$GATEWAY_PORT/#g" $CONF_DSS_PROPERTIES +} + +##function start +function changeConf(){ + sed -i "s#spring.server.port=.*#spring.server.port=$SERVER_PORT#g" $CONF_SERVER_PROPERTIES + if [[ $SERVER_NAME == "dss-framework-orchestrator-server" ]] || [[ $SERVER_NAME == "dss-workflow-server" ]]; then + SERVER_FULL_NAME=$SERVER_NAME + SERVER_FULL_NAME=$SERVER_NAME-$ENV_FLAG + sed -i "s#spring.spring.application.name=.*#spring.spring.application.name=$SERVER_FULL_NAME#g" $CONF_SERVER_PROPERTIES + fi + sed -i "s#wds.dss.appconn.scheduler.project.store.dir.*#wds.dss.appconn.scheduler.project.store.dir=$WDS_SCHEDULER_PATH#g" $CONF_SERVER_PROPERTIES + isSuccess "subsitution $CONF_SERVER_PROPERTIES of $SERVER_NAME" +} +##function end + + +UPLOAD_PUBLIC_IPS="" +##function start +function uploadProjectFile(){ + if [[ $SERVER_IP == "127.0.0.1" ]]; then + SERVER_IP=$local_host + fi + #echo "$SERVER_NAME-step3:copy install package" + # upload project conf + # cp -rfp $SSH_PORT ${workDir}/config/{$SERVER_NAME}.properties $CONF_PATH + if [[ $UPLOAD_PUBLIC_IPS == *",${ENV_FLAG}-$SERVER_IP,"* ]]; then + return 0 + fi + cp -rfp ${DSS_FILE_PATH}/* $SERVER_HOME + cp -rfp ${workDir}/bin $SERVER_HOME + cp -rfp ${workDir}/config/* $SERVER_HOME/conf + sudo chown -R $deployUser:$deployUser $SERVER_HOME + UPLOAD_PUBLIC_IPS="$UPLOAD_PUBLIC_IPS,${ENV_FLAG}-$SERVER_IP," + changeCommonConf +# echo "UPLOAD_PUBLIC_IPS-->$UPLOAD_PUBLIC_IPS" +} + +##function start +function installPackage(){ + if [[ $SERVER_IP == "127.0.0.1" ]]; then + SERVER_IP=$local_host + fi + if [ -z $SERVER_NAME ]; then + echo "ERROR:SERVER_NAME is null " + exit 1 + fi + uploadProjectFile + # 
change configuration + changeConf +} + +function dssWebInstall(){ +if ! test -e ${LINKIS_DSS_HOME}/wedatasphere-dss-web*.zip; then + echo "**********Error: please put wedatasphere-dss-web-xxx.zip in ${LINKIS_DSS_HOME}! " + exit 1 +else + echo "Start to unzip dss web package." + unzip -d ${LINKIS_DSS_HOME}/web/ -o ${LINKIS_DSS_HOME}/wedatasphere-dss-web-*.zip > /dev/null 2>&1 + sed -i "s#linkis_url.*#linkis_url=${LINKIS_GATEWAY_URL}#g" ${LINKIS_DSS_HOME}/web/config.sh + isSuccess "Unzip dss web package to ${LINKIS_DSS_HOME}/web" +fi +} + +##Install dss projects +function installDssProject() { +# if [ "$DSS_INSTALL_HOME" != "" ] +# then +# rm -rf $DSS_INSTALL_HOME +# fi + echo "" + echo "-----------------DSS install start--------------------" + SERVER_HOME=$DSS_INSTALL_HOME + if [ "$SERVER_HOME" == "" ] + then + export SERVER_HOME=${workDir}/DssInstall + fi + if [ -d $SERVER_HOME ] && [ "$SERVER_HOME" != "$workDir" ]; then + rm -r $SERVER_HOME-bak + echo "mv $SERVER_HOME $SERVER_HOME-bak" + mv $SERVER_HOME $SERVER_HOME-bak + fi + echo "create dir SERVER_HOME: $SERVER_HOME" + sudo mkdir -p $SERVER_HOME;sudo chown -R $deployUser:$deployUser $SERVER_HOME + isSuccess "Create the dir of $SERVER_HOME" + + echo "" + SERVER_NAME=dss-framework-project-server + SERVER_IP=$DSS_FRAMEWORK_PROJECT_SERVER_INSTALL_IP + SERVER_PORT=$DSS_FRAMEWORK_PROJECT_SERVER_PORT + + UPLOAD_LIB_FILES=$DSS_FILE_PATH/lib/dss-framework/$SERVER_NAME + LIB_PATH=$SERVER_HOME/lib/dss-framework + LOG_PATH=$SERVER_HOME/logs/dss-framework/$SERVER_NAME + CONF_SERVER_PROPERTIES=$SERVER_HOME/conf/$SERVER_NAME.properties + CONF_DSS_PROPERTIES=$SERVER_HOME/conf/dss.properties + CONF_APPLICATION_YML=$SERVER_HOME/conf/application-dss.yml + ###install project-Server + installPackage + echo "" + + SERVER_NAME=dss-framework-orchestrator-server + SERVER_IP=$DSS_FRAMEWORK_ORCHESTRATOR_SERVER_INSTALL_IP + SERVER_PORT=$DSS_FRAMEWORK_ORCHESTRATOR_SERVER_PORT + UPLOAD_LIB_FILES=$DSS_FILE_PATH/lib/dss-framework/$SERVER_NAME 
+ LIB_PATH=$SERVER_HOME/lib/dss-framework + LOG_PATH=$SERVER_HOME/logs/dss-framework/$SERVER_NAME + CONF_SERVER_PROPERTIES=$SERVER_HOME/conf/$SERVER_NAME.properties + CONF_DSS_PROPERTIES=$SERVER_HOME/conf/dss.properties + CONF_APPLICATION_YML=$SERVER_HOME/conf/application-dss.yml + ###install project-Server + installPackage + echo "" + + SERVER_NAME=dss-apiservice-server + SERVER_IP=$DSS_APISERVICE_SERVER_INSTALL_IP + SERVER_PORT=$DSS_APISERVICE_SERVER_PORT + UPLOAD_LIB_FILES=$DSS_FILE_PATH/lib/dss-apps/$SERVER_NAME + LIB_PATH=$SERVER_HOME/lib/dss-apps + LOG_PATH=$SERVER_HOME/logs/dss-apps/$SERVER_NAME + CONF_SERVER_PROPERTIES=$SERVER_HOME/conf/$SERVER_NAME.properties + CONF_DSS_PROPERTIES=$SERVER_HOME/conf/dss.properties + CONF_APPLICATION_YML=$SERVER_HOME/conf/application-dss.yml + ###install dss-apiservice-server + installPackage + echo "" + + SERVER_NAME=dss-datapipe-server + SERVER_IP=$DSS_DATAPIPE_SERVER_INSTALL_IP + SERVER_PORT=$DSS_DATAPIPE_SERVER_PORT + UPLOAD_LIB_FILES=$DSS_FILE_PATH/lib/dss-apps/$SERVER_NAME + LIB_PATH=$SERVER_HOME/lib/dss-apps + LOG_PATH=$SERVER_HOME/logs/dss-apps/$SERVER_NAME + CONF_SERVER_PROPERTIES=$SERVER_HOME/conf/$SERVER_NAME.properties + CONF_DSS_PROPERTIES=$SERVER_HOME/conf/dss.properties + CONF_APPLICATION_YML=$SERVER_HOME/conf/application-dss.yml + ###install dss-datapipe-server + installPackage + echo "" + + ##Flow execution Install + PACKAGE_DIR=dss + SERVER_NAME=dss-flow-execution-server + SERVER_IP=$DSS_FLOW_EXECUTION_SERVER_INSTALL_IP + SERVER_PORT=$DSS_FLOW_EXECUTION_SERVER_PORT + UPLOAD_LIB_FILES=$DSS_FILE_PATH/lib/dss-orchestrator/$SERVER_NAME + LIB_PATH=$SERVER_HOME/lib/dss-orchestrator + LOG_PATH=$SERVER_HOME/logs/dss-orchestrator/$SERVER_NAME + CONF_SERVER_PROPERTIES=$SERVER_HOME/conf/$SERVER_NAME.properties + CONF_DSS_PROPERTIES=$SERVER_HOME/conf/dss.properties + CONF_APPLICATION_YML=$SERVER_HOME/conf/application-dss.yml + ###Install flow execution + installPackage + echo "" + + SERVER_NAME=dss-workflow-server + 
SERVER_IP=$DSS_WORKFLOW_SERVER_INSTALL_IP + SERVER_PORT=$DSS_WORKFLOW_SERVER_PORT + UPLOAD_LIB_FILES=$DSS_FILE_PATH/lib/dss-orchestrator/$SERVER_NAME + LIB_PATH=$SERVER_HOME/lib/dss-orchestrator + LOG_PATH=$SERVER_HOME/logs/dss-orchestrator/$SERVER_NAME + CONF_SERVER_PROPERTIES=$SERVER_HOME/conf/$SERVER_NAME.properties + CONF_DSS_PROPERTIES=$SERVER_HOME/conf/dss.properties + CONF_APPLICATION_YML=$SERVER_HOME/conf/application-dss.yml + ###install dss-workflow-server + installPackage + echo "" + echo "-----------------DSS install end--------------------" + echo "" + +} +ENV_FLAG="dev" +installDssProject \ No newline at end of file diff --git a/assembly/config/config.sh b/assembly/config/config.sh new file mode 100644 index 000000000..ba5307eac --- /dev/null +++ b/assembly/config/config.sh @@ -0,0 +1,81 @@ +### deploy user +deployUser=hadoop + +## Max heap size of each microservice +SERVER_HEAP_SIZE="512M" + + +### The install home path of DSS. Must be provided. +DSS_INSTALL_HOME=/appcom/Install/dss-dev + +### Linkis Eureka information (microservice registration and discovery center) +EUREKA_INSTALL_IP=127.0.0.1 +EUREKA_PORT=20303 + +### Specifies the user workspace, which is used to store the user's script files and log files. +### Generally a local directory +#WORKSPACE_USER_ROOT_PATH=file:///tmp/linkis/ +#### Path to store job ResultSet: file or hdfs path +#RESULT_SET_ROOT_PATH=hdfs:///tmp/linkis + +### Linkis Gateway information +GATEWAY_INSTALL_IP=127.0.0.1 +GATEWAY_PORT=9001 + +# for azkaban +WDS_SCHEDULER_PATH=file:///appcom/tmp/wds/scheduler + +################### The install Configuration of all Micro-Services ##################### +# +# NOTICE: +# 1. If you just want to try DSS, the following micro-service configuration can be left as-is; +# these services will be installed on this machine by default. +# 2. 
In order to get the most complete enterprise-level features, we strongly recommend that you configure +# the following microservice parameters +# + +### DSS_SERVER +### This service is used to provide dss-server capability. + +### project-server +DSS_FRAMEWORK_PROJECT_SERVER_INSTALL_IP=127.0.0.1 +DSS_FRAMEWORK_PROJECT_SERVER_PORT=9002 +### orchestrator-server +DSS_FRAMEWORK_ORCHESTRATOR_SERVER_INSTALL_IP=127.0.0.1 +DSS_FRAMEWORK_ORCHESTRATOR_SERVER_PORT=9003 +### apiservice-server +DSS_APISERVICE_SERVER_INSTALL_IP=127.0.0.1 +DSS_APISERVICE_SERVER_PORT=9004 +### dss-workflow-server +DSS_WORKFLOW_SERVER_INSTALL_IP=127.0.0.1 +DSS_WORKFLOW_SERVER_PORT=9005 +### dss-flow-execution-server +DSS_FLOW_EXECUTION_SERVER_INSTALL_IP=127.0.0.1 +DSS_FLOW_EXECUTION_SERVER_PORT=9006 +### dss-datapipe-server +DSS_DATAPIPE_SERVER_INSTALL_IP=127.0.0.1 +DSS_DATAPIPE_SERVER_PORT=9008 + +############## ############## dss_appconn_instance configuration start ############## ############## +EVENTCHECKER_JDBC_URL="jdbc:mysql://127.0.0.1:3306/dss_linkis?characterEncoding=UTF-8" +EVENTCHECKER_JDBC_USERNAME=hadoop +EVENTCHECKER_JDBC_PASSWORD=hadoop + +DATACHECKER_JOB_JDBC_URL="jdbc:mysql://127.0.0.1:3306/hive_gz_bdap_test_01?useUnicode=true" +DATACHECKER_JOB_JDBC_USERNAME=hadoop +DATACHECKER_JOB_JDBC_PASSWORD=hadoop + +DATACHECKER_BDP_JDBC_URL="jdbc:mysql://127.0.0.1:3306/uat2_metastore?characterEncoding=UTF-8" +DATACHECKER_BDP_JDBC_USERNAME=hadoop +DATACHECKER_BDP_JDBC_PASSWORD=hadoop + +EMAIL_HOST=smtp.163.com +EMAIL_PORT=25 +EMAIL_USERNAME=xxx@163.com +EMAIL_PASSWORD=xxxxx +EMAIL_PROTOCOL=smtp +############## ############## dss_appconn_instance configuration end ############## ############## + +DSS_VERSION=1.0.0 + +DSS_FILE_NAME="dss-$DSS_VERSION" \ No newline at end of file diff --git a/assembly/config/db.sh b/assembly/config/db.sh new file mode 100644 index 000000000..148ae93a9 --- /dev/null +++ b/assembly/config/db.sh @@ -0,0 +1,8 @@ +### for DSS-Server and Eventchecker APPCONN 
+MYSQL_HOST=127.0.0.1 +MYSQL_PORT=3306 +MYSQL_DB=dss_dev +MYSQL_USER=hadoop +MYSQL_PASSWORD=hadoop + + diff --git a/assembly/dss-package/pom.xml b/assembly/dss-package/pom.xml new file mode 100644 index 000000000..59643ab2c --- /dev/null +++ b/assembly/dss-package/pom.xml @@ -0,0 +1,179 @@ + + + + + + dss + com.webank.wedatasphere.dss + 1.0.0 + + 4.0.0 + dss-package + + + com.webank.wedatasphere.linkis + linkis-rpc + ${linkis.version} + + + jackson-databind + com.fasterxml.jackson.core + + + jackson-core + com.fasterxml.jackson.core + + + jackson-core-asl + org.codehaus.jackson + + + httpclient + org.apache.httpcomponents + + + hibernate-validator + org.hibernate.validator + + + + + com.webank.wedatasphere.dss + dss-common + ${dss.version} + + + jackson-core + com.fasterxml.jackson.core + + + + + com.webank.wedatasphere.dss + dss-sender-service + ${dss.version} + + + + org.apache.httpcomponents + httpclient + ${httpclient.version} + + + + org.apache.httpcomponents + httpmime + ${httpmime.version} + + + commons-beanutils + commons-beanutils + ${beanutils.version} + + + + com.webank.wedatasphere.linkis + linkis-hadoop-common + ${linkis.version} + + + com.fasterxml.jackson.core + jackson-databind + ${fasterxml.jackson.version} + + + + org.apache.commons + commons-math3 + ${math3.version} + + + + org.apache.commons + commons-lang3 + ${commons.lang3.version} + + + + com.webank.wedatasphere.linkis + linkis-mybatis + ${linkis.version} + + + com.webank.wedatasphere.linkis + linkis-storage + ${linkis.version} + + + + org.glassfish.jersey.ext + jersey-bean-validation + 2.21 + + + + + + + + org.apache.maven.plugins + maven-install-plugin + + true + + + + org.apache.maven.plugins + maven-antrun-plugin + + + package + + run + + + + + + org.apache.maven.plugins + maven-assembly-plugin + 2.3 + + + dist + package + + single + + + false + out + false + false + + src/main/assembly/distribution.xml + + + + + + + + + \ No newline at end of file diff --git 
a/assembly/dss-package/src/main/assembly/distribution.xml b/assembly/dss-package/src/main/assembly/distribution.xml new file mode 100644 index 000000000..459fdb417 --- /dev/null +++ b/assembly/dss-package/src/main/assembly/distribution.xml @@ -0,0 +1,255 @@ + + + + dist + + dir + + true + dss-${dss.version} + + + + + + lib/dss-commons/ + true + true + false + true + true + + + + + + ${basedir}/../../ + . + + README* + LICENSE* + NOTICE* + + + + + + ${basedir}/../../conf + + conf + + **/* + + unix + + + + + ${basedir}/../../sbin + + sbin + + **/* + + unix + + + + + + ${basedir}/../../dss-framework/dss-framework-project-server/target/out/dss-framework-project-server/lib/ + + lib/dss-framework/dss-framework-project-server + + **/* + + + + + + + ${basedir}/../../dss-framework/dss-framework-orchestrator-server/target/out/dss-framework-orchestrator-server/lib/ + + lib/dss-framework/dss-framework-orchestrator-server/ + + **/* + + + + + + + ${basedir}/../../dss-apps/dss-apiservice-server/target/out/dss-apiservice-server/lib/ + + lib/dss-apps/dss-apiservice-server + + **/* + + + + + + + ${basedir}/../../dss-orchestrator/orchestrators/dss-workflow/dss-workflow-server/target/out/dss-workflow-server/lib/ + + lib/dss-orchestrator/dss-workflow-server/ + + **/* + + + + + + + ${basedir}/../../dss-orchestrator/orchestrators/dss-workflow/dss-flow-execution-server/target/out/dss-flow-execution-server/lib/ + + lib/dss-orchestrator/dss-flow-execution-server/ + + **/* + + + + + + + ${basedir}/../../dss-appconn/appconns/dss-datachecker-appconn/target/out/ + + dss-appconns + + **/* + + + + + + + ${basedir}/../../dss-appconn/appconns/dss-eventchecker-appconn/target/out/ + + dss-appconns + + **/* + + + + + + + ${basedir}/../../dss-appconn/appconns/dss-orchestrator-framework-appconn/target/out/ + + dss-appconns + + **/* + + + + + + + ${basedir}/../../dss-appconn/appconns/dss-workflow-appconn/target/out/ + + dss-appconns + + **/* + + + + + + + 
${basedir}/../../dss-appconn/appconns/dss-schedulis-appconn/target/out/ + + dss-appconns + + **/* + + + + + + + ${basedir}/../../dss-appconn/appconns/dss-visualis-appconn/target/out/ + + dss-appconns + + **/* + + + + + + + + + + ${basedir}/../../dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/target/out/ + + dss-appconns + + **/* + + + + + + ${basedir}/../../dss-appconn/linkis-appconn-engineplugin/target/ + + dss-appconns + + *.zip + + + + + + ${basedir}/../../plugins/linkis/dss-gateway-support/target/ + + lib/dss-plugins/linkis/dss-gateway/ + + *${dss.version}.jar + + + + + + ${basedir}/../../plugins/azkaban/linkis-jobtype/target/ + + lib/dss-plugins/azkaban/linkis-jobtype/ + + *${dss.version}.zip + + + + + + + ${basedir}/../../dss-apps/dss-datapipe-server/target/out/dss-datapipe-server/lib/ + + lib/dss-apps/dss-datapipe-server + + **/* + + + + diff --git a/assembly/pom.xml b/assembly/pom.xml index 539581596..7ada30d10 100644 --- a/assembly/pom.xml +++ b/assembly/pom.xml @@ -1,9 +1,8 @@ - - lib - true - true - false - true - true - - + @@ -47,11 +34,13 @@ + + - ${project.parent.basedir}/conf/ + ${project.parent.basedir}/assembly/config/ - conf + config **/* @@ -60,7 +49,7 @@ - ${project.parent.basedir}/bin/ + ${project.parent.basedir}/assembly/bin/ bin @@ -71,7 +60,7 @@ - ${project.parent.basedir}/db/ + ${project.parent.basedir}/db db @@ -79,118 +68,16 @@ - + - ${project.parent.basedir}/dss-server/target/ + ${project.parent.basedir}/assembly/dss-package/target/out - share/dss/dss-server + - *.zip - - - - - - - - ${project.parent.basedir}/datachecker-appjoint/target/ - - share/appjoints/datachecker - - *.zip - - - - - - ${project.parent.basedir}/eventchecker-appjoint/target/ - - share/appjoints/eventchecker - - *.zip - - - - - - ${project.parent.basedir}/dss-azkaban-scheduler-appjoint/target/ - - share/appjoints/schedulis - - *.zip - - - - - - ${project.parent.basedir}/dss-flow-execution-entrance/target/ - - share/dss/dss-flow-execution-entrance - 
- *.zip - - - - - - - - ${project.parent.basedir}/visualis-appjoint/appjoint/target/ - - share/appjoints/visualis - - *.zip - - - - - - - ${project.parent.basedir}/qualitis-appjoint/appjoint/target/ - - share/appjoints/qualitis - - *.zip - - - - - - - ${project.parent.basedir}/visualis-appjoint/server/assembly/target/ - - share/visualis-server/ - - *.zip + **/* - - - - ${project.parent.basedir}/plugins/linkis/linkis-appjoint-entrance/target/ - - share/plugins/linkis/linkis-appjoint-entrance - - *.zip - - - - - - - - - - - - - - - - - - diff --git a/bin/checkServices.sh b/bin/checkServices.sh deleted file mode 100644 index 72df04be4..000000000 --- a/bin/checkServices.sh +++ /dev/null @@ -1,91 +0,0 @@ -# -# Copyright 2019 WeBank -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. 
-# -#!/bin/sh -source ~/.bash_profile - -shellDir=`dirname $0` -workDir=`cd ${shellDir}/..;pwd` - -##load config -export LINKIS_DSS_CONF_FILE=${LINKIS_DSS_CONF_FILE:-"${workDir}/conf/config.sh"} -export DISTRIBUTION=${DISTRIBUTION:-"${workDir}/conf/config.sh"} -source ${LINKIS_DSS_CONF_FILE} -source ${DISTRIBUTION} - -MICRO_SERVICE_NAME=$1 -MICRO_SERVICE_IP=$2 -MICRO_SERVICE_PORT=$3 - -local_host="`hostname --fqdn`" - -ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/(.*)/, "\\1", "g", $2)}'|awk 'NR==1') - -function isLocal(){ - if [ "$1" == "127.0.0.1" ];then - return 0 - elif [ $1 == "localhost" ]; then - return 0 - elif [ $1 == $local_host ]; then - return 0 - elif [ $1 == $ipaddr ]; then - return 0 - fi - return 1 -} - -function executeCMD(){ - isLocal $1 - flag=$? - echo "Is local "$flag - if [ $flag == "0" ];then - eval $2 - else - ssh -p $SSH_PORT $1 $2 - fi - -} - -#echo "<--------------------------------------------------------------------------->" -#echo "Start to Check if your microservice:$MICRO_SERVICE_NAME is normal via telnet" -#echo "" -#if ! executeCMD $SERVER_IP "test -e $DSS_INSTALL_HOME/$MICRO_SERVICE_NAME"; then -# echo "$MICRO_SERVICE_NAME is not installed,the check steps will be skipped" -# exit 0 -#fi -echo "===========================================================" -echo $MICRO_SERVICE_NAME -echo $MICRO_SERVICE_IP -echo $MICRO_SERVICE_PORT -echo "===========================================================" - -if [ $MICRO_SERVICE_NAME == "visualis-server" ] && [ $MICRO_SERVICE_IP == "127.0.0.1" ]; then - MICRO_SERVICE_IP=$ipaddr -fi - -result=`echo -e "\n" | telnet $MICRO_SERVICE_IP $MICRO_SERVICE_PORT 2>/dev/null | grep Connected | wc -l` -if [ $result -eq 1 ]; then - echo "$MICRO_SERVICE_NAME is ok." -else - echo "<--------------------------------------------------------------------------->" - echo "ERROR your $MICRO_SERVICE_NAME microservice is not start successful !!! 
ERROR logs as follows :" - echo "PLEAESE CHECK DETAIL LOG,LOCATION:$DSS_INSTALL_HOME/$MICRO_SERVICE_NAME/logs/linkis.out" - echo '<------------------------------------------------------------->' - executeCMD $MICRO_SERVICE_IP "tail -n 50 $DSS_INSTALL_HOME/$MICRO_SERVICE_NAME/logs/*.out" - echo '<-------------------------------------------------------------->' - echo "PLEAESE CHECK DETAIL LOG,LOCATION:$DSS_INSTALL_HOME/$MICRO_SERVICE_NAME/logs/linkis.out" - exit 1 -fi - diff --git a/bin/install.sh b/bin/install.sh deleted file mode 100644 index 170b03360..000000000 --- a/bin/install.sh +++ /dev/null @@ -1,583 +0,0 @@ -# -# Copyright 2019 WeBank -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. -# -#!/bin/sh -#Actively load user env - -source ~/.bash_profile - -shellDir=`dirname $0` -workDir=`cd ${shellDir}/..;pwd` - -#To be compatible with MacOS and Linux -txt="" -if [[ "$OSTYPE" == "darwin"* ]]; then - txt="''" -elif [[ "$OSTYPE" == "linux-gnu" ]]; then - # linux - txt="" -elif [[ "$OSTYPE" == "cygwin" ]]; then - echo "dss not support Windows operating system" - exit 1 -elif [[ "$OSTYPE" == "msys" ]]; then - echo "dss not support Windows operating system" - exit 1 -elif [[ "$OSTYPE" == "win32" ]]; then - echo "dss not support Windows operating system" - exit 1 -elif [[ "$OSTYPE" == "freebsd"* ]]; then - txt="" -else - echo "Operating system unknown, please tell us(submit issue) for better service" - exit 1 -fi - -function isSuccess(){ -if [ $? 
-ne 0 ]; then - echo "Failed to " + $1 - exit 1 -else - echo "Succeed to" + $1 -fi -} - -function checkJava(){ - java -version - isSuccess "execute java --version" -} - -function checkExternalServer(){ -echo "telnet check for your $SERVER_NAME, if you wait for a long time,may be your $SERVER_NAME does not prepared" -result=`echo -e "\n" | telnet $EXTERNAL_SERVER_IP $EXTERNAL_SERVER_PORT 2>/dev/null | grep Connected | wc -l` -if [ $result -eq 1 ]; then - echo "$SERVER_NAME is OK." -else - echo "$SERVER_NAME is Bad. You need to prepare the' $SERVER_NAME ' environment in advance" - exit 1 -fi -} - - -say() { - printf 'check command fail \n %s\n' "$1" -} - -err() { - say "$1" >&2 - exit 1 -} - -check_cmd() { - command -v "$1" > /dev/null 2>&1 -} - -need_cmd() { - if ! check_cmd "$1"; then - err "need '$1' (command not found)" - fi -} - -#check env -sh ${workDir}/bin/checkEnv.sh -isSuccess "check env" - -##load config -echo "step1:load config" -export DSS_CONFIG_PATH=${DSS_CONFIG_PATH:-"${workDir}/conf/config.sh"} -export DSS_DB_CONFIG_PATH=${DSS_DB_CONFIG_PATH:-"${workDir}/conf/db.sh"} -export DISTRIBUTION=${DISTRIBUTION:-"${workDir}/conf/config.sh"} -source ${DSS_CONFIG_PATH} -source ${DSS_DB_CONFIG_PATH} -source ${DISTRIBUTION} -isSuccess "load config" - -local_host="`hostname --fqdn`" -ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/(.*)/, "\\1", "g", $2)}'|awk 'NR==1') - -function isLocal(){ - if [ "$1" == "127.0.0.1" ];then - return 0 - elif [ $1 == "localhost" ]; then - return 0 - elif [ $1 == $local_host ]; then - return 0 - elif [ $1 == $ipaddr ]; then - return 0 - fi - return 1 -} - -function executeCMD(){ - isLocal $1 - flag=$? - if [ $flag == "0" ];then - echo "Is local execution:$2" - eval $2 - else - echo "Is remote execution:$2" - ssh -p $SSH_PORT $1 $2 - fi -} - -function copyFile(){ - isLocal $1 - flag=$? 
- src=$2 - dest=$3 - if [ $flag == "0" ];then - echo "Is local cp " - eval "cp -r $src $dest" - else - echo "Is remote cp " - scp -r -P $SSH_PORT $src $1:$dest - fi -} - -##install mode choice -if [ "$INSTALL_MODE" == "" ];then - echo "Please enter the mode selection such as: 1" - echo " 1: Lite" - echo " 2: Simple" - echo " 3: Standard" - echo "" - read -p "Please input the choice:" idx - INSTALL_MODE=$idx -fi - -if [[ '1' = "$INSTALL_MODE" ]];then - echo "You chose lite installation mode" - checkJava - SERVER_NAME=MYSQL - EXTERNAL_SERVER_IP=$MYSQL_HOST - EXTERNAL_SERVER_PORT=$MYSQL_PORT - checkExternalServer -elif [[ '2' = "$INSTALL_MODE" ]];then - echo "You chose sample installation mode" - checkJava - SERVER_NAME=MYSQL - EXTERNAL_SERVER_IP=$MYSQL_HOST - EXTERNAL_SERVER_PORT=$MYSQL_PORT - checkExternalServer -elif [[ '3' = "$INSTALL_MODE" ]];then - echo "You chose Standard installation mode" - #check for Java - checkJava - #check for mysql - SERVER_NAME=MYSQL - EXTERNAL_SERVER_IP=$MYSQL_HOST - EXTERNAL_SERVER_PORT=$MYSQL_PORT - checkExternalServer - #check qualitis serivice - SERVER_NAME=Qualitis - EXTERNAL_SERVER_IP=$QUALITIS_ADRESS_IP - EXTERNAL_SERVER_PORT=$QUALITIS_ADRESS_PORT - if [[ $IGNORECHECK = "" ]];then - checkExternalServer - fi - #check azkaban serivice - SERVER_NAME=AZKABAN - EXTERNAL_SERVER_IP=$AZKABAN_ADRESS_IP - EXTERNAL_SERVER_PORT=$AZKABAN_ADRESS_PORT - if [[ $IGNORECHECK = "" ]];then - checkExternalServer - fi -else - echo "no choice,exit!" - exit 1 -fi - -##init db -echo "Do you want to clear DSS table information in the database?" -echo " 1: Do not execute table-building statements" -echo " 2: Dangerous! Clear all data and rebuild the tables." 
-echo "" - -MYSQL_INSTALL_MODE=1 - -read -p "Please input the choice:" idx -if [[ '2' = "$idx" ]];then - MYSQL_INSTALL_MODE=2 - echo "You chose Rebuild the table" -elif [[ '1' = "$idx" ]];then - MYSQL_INSTALL_MODE=1 - echo "You chose not execute table-building statements" -else - echo "no choice,exit!" - exit 1 -fi - -echo "create hdfs directory and local directory" -if [ "$WORKSPACE_USER_ROOT_PATH" != "" ] -then - localRootDir=$WORKSPACE_USER_ROOT_PATH - if [[ $WORKSPACE_USER_ROOT_PATH == file://* ]];then - localRootDir=${WORKSPACE_USER_ROOT_PATH#file://} - mkdir -p $localRootDir/$deployUser - sudo chmod -R 775 $localRootDir/$deployUser - elif [[ $WORKSPACE_USER_ROOT_PATH == hdfs://* ]];then - localRootDir=${WORKSPACE_USER_ROOT_PATH#hdfs://} - hdfs dfs -mkdir -p $localRootDir/$deployUser - hdfs dfs -chmod -R 775 $localRootDir/$deployUser - else - echo "does not support $WORKSPACE_USER_ROOT_PATH filesystem types" - fi -isSuccess "create $WORKSPACE_USER_ROOT_PATH directory" -fi - - -if [ "$RESULT_SET_ROOT_PATH" != "" ] -then - localRootDir=$RESULT_SET_ROOT_PATH - if [[ $RESULT_SET_ROOT_PATH == file://* ]];then - localRootDir=${RESULT_SET_ROOT_PATH#file://} - mkdir -p $localRootDir/$deployUser - sudo chmod -R 775 $localRootDir/$deployUser - elif [[ $RESULT_SET_ROOT_PATH == hdfs://* ]];then - localRootDir=${RESULT_SET_ROOT_PATH#hdfs://} - hdfs dfs -mkdir -p $localRootDir/$deployUser - hdfs dfs -chmod -R 775 $localRootDir/$deployUser - else - echo "does not support $RESULT_SET_ROOT_PATH filesystem types" - fi -isSuccess "create $RESULT_SET_ROOT_PATH directory" -fi - - -if [ "$WDS_SCHEDULER_PATH" != "" ] -then - localRootDir=$WDS_SCHEDULER_PATH - if [[ $WDS_SCHEDULER_PATH == file://* ]];then - localRootDir=${WDS_SCHEDULER_PATH#file://} - mkdir -p $localRootDir - sudo chmod -R 775 $localRootDir - elif [[ $WDS_SCHEDULER_PATH == hdfs://* ]];then - localRootDir=${WDS_SCHEDULER_PATH#hdfs://} - hdfs dfs -mkdir -p $localRootDir - hdfs dfs -chmod -R 775 $localRootDir - else - 
echo "does not support $WDS_SCHEDULER_PATH filesystem types" - fi -isSuccess "create $WDS_SCHEDULER_PATH directory" -fi - - -##init db -if [[ '2' = "$MYSQL_INSTALL_MODE" ]];then - mysql -h$MYSQL_HOST -P$MYSQL_PORT -u$MYSQL_USER -p$MYSQL_PASSWORD -D$MYSQL_DB --default-character-set=utf8 -e "source ${workDir}/db/dss_ddl.sql" - isSuccess "source dss_ddl.sql" - LOCAL_IP=$ipaddr - if [ $GATEWAY_INSTALL_IP == "127.0.0.1" ];then - echo "GATEWAY_INSTALL_IP is equals 127.0.0.1 ,we will change it to ip address" - GATEWAY_INSTALL_IP_2=$LOCAL_IP - else - GATEWAY_INSTALL_IP_2=$GATEWAY_INSTALL_IP - fi - #echo $GATEWAY_INSTALL_IP_2 - sed -i "s/GATEWAY_INSTALL_IP_2/$GATEWAY_INSTALL_IP_2/g" ${workDir}/db/dss_dml.sql - sed -i "s/GATEWAY_PORT/$GATEWAY_PORT/g" ${workDir}/db/dss_dml.sql - mysql -h$MYSQL_HOST -P$MYSQL_PORT -u$MYSQL_USER -p$MYSQL_PASSWORD -D$MYSQL_DB --default-character-set=utf8 -e "source ${workDir}/db/dss_dml.sql" - isSuccess "source dss_dml.sql" - - if [[ '2' = "$INSTALL_MODE" ]] || [[ '3' = "$INSTALL_MODE" ]];then - echo "visualis support,visualis database will be initialized !" 
- if [ $VISUALIS_NGINX_IP == "127.0.0.1" ]||[ $VISUALIS_NGINX_IP == "0.0.0.0" ];then - echo "VISUALIS_NGINX_IP is equals $VISUALIS_NGINX_IP ,we will change it to ip address" - VISUALIS_NGINX_IP_2=$LOCAL_IP - else - VISUALIS_NGINX_IP_2=$VISUALIS_NGINX_IP - fi - #echo $VISUALIS_NGINX_IP_2 - sed -i "s/VISUALIS_NGINX_IP_2/$VISUALIS_NGINX_IP_2/g" ${workDir}/db/visualis.sql - sed -i "s/VISUALIS_NGINX_PORT/$VISUALIS_NGINX_PORT/g" ${workDir}/db/visualis.sql - mysql -h$MYSQL_HOST -P$MYSQL_PORT -u$MYSQL_USER -p$MYSQL_PASSWORD -D$MYSQL_DB --default-character-set=utf8 -e "source ${workDir}/db/visualis.sql" - isSuccess "source visualis.sql" - mysql -h$MYSQL_HOST -P$MYSQL_PORT -u$MYSQL_USER -p$MYSQL_PASSWORD -D$MYSQL_DB --default-character-set=utf8 -e "source ${workDir}/db/davinci.sql" - isSuccess "source davinci.sql" - fi - - if [[ '3' = "$INSTALL_MODE" ]];then - echo "azkaban and qualitis support, azkaban and qualitis database will be initialized !" - #azkaban - if [ $AZKABAN_ADRESS_IP == "127.0.0.1" ];then - echo "AZKABAN_ADRESS_IP is equals 127.0.0.1 ,we will change it to ip address" - AZKABAN_ADRESS_IP_2=$LOCAL_IP - else - AZKABAN_ADRESS_IP_2=$AZKABAN_ADRESS_IP - fi - echo $AZKABAN_ADRESS_IP_2 - sed -i "s/AZKABAN_ADRESS_IP_2/$AZKABAN_ADRESS_IP_2/g" ${workDir}/db/azkaban.sql - sed -i "s/AZKABAN_ADRESS_PORT/$AZKABAN_ADRESS_PORT/g" ${workDir}/db/azkaban.sql - mysql -h$MYSQL_HOST -P$MYSQL_PORT -u$MYSQL_USER -p$MYSQL_PASSWORD -D$MYSQL_DB --default-character-set=utf8 -e "source ${workDir}/db/azkaban.sql" - isSuccess "source azkaban.sql" - #qualitis - if [ $QUALITIS_ADRESS_IP == "127.0.0.1" ];then - echo "QUALITIS_ADRESS_IP is equals 127.0.0.1 ,we will change it to ip address" - QUALITIS_ADRESS_IP_2=$LOCAL_IP - else - QUALITIS_ADRESS_IP_2=$QUALITIS_ADRESS_IP - fi - echo $QUALITIS_ADRESS_IP_2 - sed -i "s/QUALITIS_ADRESS_IP_2/$QUALITIS_ADRESS_IP_2/g" ${workDir}/db/qualitis.sql - sed -i "s/QUALITIS_ADRESS_PORT/$QUALITIS_ADRESS_PORT/g" ${workDir}/db/qualitis.sql - mysql -h$MYSQL_HOST 
-P$MYSQL_PORT -u$MYSQL_USER -p$MYSQL_PASSWORD -D$MYSQL_DB --default-character-set=utf8 -e "source ${workDir}/db/qualitis.sql" - isSuccess "source qualitis.sql" - fi -fi - -##Deal special symbol '#' -HIVE_META_PASSWORD=$(echo ${HIVE_META_PASSWORD//'#'/'\#'}) -MYSQL_PASSWORD=$(echo ${MYSQL_PASSWORD//'#'/'\#'}) - -###linkis Eurkea info -SERVER_IP=$EUREKA_INSTALL_IP -SERVER_PORT=$EUREKA_PORT -SERVER_HOME=$DSS_INSTALL_HOME - -if test -z "$SERVER_IP" -then - SERVER_IP=$local_host -fi -EUREKA_URL=http://$SERVER_IP:$EUREKA_PORT/eureka/ - -##function start -function installPackage(){ -echo "start to install $SERVERNAME" -echo "$SERVERNAME-step1: create dir" -if test -z "$SERVER_IP" -then - SERVER_IP=$local_host -fi - -if ! executeCMD $SERVER_IP "test -e $SERVER_HOME"; then - executeCMD $SERVER_IP "sudo mkdir -p $SERVER_HOME;sudo chown -R $deployUser:$deployUser $SERVER_HOME" - isSuccess "create the dir of $SERVERNAME" -fi - -echo "$SERVERNAME-step2:copy install package" -copyFile $SERVER_IP ${workDir}/share/$PACKAGE_DIR/$SERVERNAME.zip $SERVER_HOME - -if ! 
executeCMD $SERVER_IP "test -e $SERVER_HOME/lib"; then - copyFile $SERVER_IP ${workDir}/lib $SERVER_HOME -fi - -#copyFile $SERVER_IP ${workDir}/lib $SERVER_HOME -isSuccess "copy ${SERVERNAME}.zip" -executeCMD $SERVER_IP "cd $SERVER_HOME/;rm -rf $SERVERNAME-bak; mv -f $SERVERNAME $SERVERNAME-bak" -executeCMD $SERVER_IP "cd $SERVER_HOME/;unzip $SERVERNAME.zip > /dev/null" -executeCMD $SERVER_IP "cd $SERVER_HOME/;scp -r lib/* $SERVER_HOME/$SERVERNAME/lib" -isSuccess "unzip ${SERVERNAME}.zip" - -echo "$SERVERNAME-step3:subsitution conf" -SERVER_CONF_PATH=$SERVER_HOME/$SERVERNAME/conf/application.yml -executeCMD $SERVER_IP "sed -i \"s#port:.*#port: $SERVER_PORT#g\" $SERVER_CONF_PATH" -executeCMD $SERVER_IP "sed -i \"s#defaultZone:.*#defaultZone: $EUREKA_URL#g\" $SERVER_CONF_PATH" -executeCMD $SERVER_IP "sed -i \"s#hostname:.*#hostname: $SERVER_IP#g\" $SERVER_CONF_PATH" -isSuccess "subsitution conf of $SERVERNAME" -} -##function end - -##function start -function installVisualis(){ -echo "start to install $SERVERNAME" -echo "$SERVERNAME-step1: create dir" -if test -z "$SERVER_IP" -then - SERVER_IP=$local_host -fi - -if ! executeCMD $SERVER_IP "test -e $SERVER_HOME"; then - executeCMD $SERVER_IP "sudo mkdir -p $SERVER_HOME;sudo chown -R $deployUser:$deployUser $SERVER_HOME" - isSuccess "create the dir of $SERVERNAME" -fi - -echo "$SERVERNAME-step2:copy install package" -copyFile $SERVER_IP ${workDir}/share/$PACKAGE_DIR/$SERVERNAME.zip $SERVER_HOME -isSuccess "copy ${SERVERNAME}.zip" -executeCMD $SERVER_IP "cd $SERVER_HOME/;rm -rf $SERVERNAME-bak; mv -f $SERVERNAME $SERVERNAME-bak" -executeCMD $SERVER_IP "cd $SERVER_HOME/;unzip $SERVERNAME.zip > /dev/null" -isSuccess "unzip ${SERVERNAME}.zip" -} -##function end - - -##function start -function installAppjoints(){ -echo "start to install $APPJOINTNAME" -echo "$APPJOINTNAME Install-step1: create dir" -if test -z "$SERVER_IP" -then - SERVER_IP=$local_host -fi - -if ! 
executeCMD $SERVER_IP "test -e $SERVER_HOME/$APPJOINTPARENT"; then - executeCMD $SERVER_IP "sudo mkdir -p $SERVER_HOME/$APPJOINTPARENT;sudo chown -R $deployUser:$deployUser $SERVER_HOME/$APPJOINTPARENT" - isSuccess "create the dir of $SERVER_HOME/$APPJOINTPARENT;" -fi - -echo "$APPJOINTNAME-step2:copy install package" -copyFile $SERVER_IP $workDir/share/appjoints/$APPJOINTNAME/*.zip $SERVER_HOME/$APPJOINTPARENT -isSuccess "copy ${APPJOINTNAME}.zip" -executeCMD $SERVER_IP "cd $SERVER_HOME/$APPJOINTPARENT/;unzip -o dss-*-appjoint.zip > /dev/null;rm -rf dss-*-appjoint.zip" -isSuccess "install ${APPJOINTNAME}.zip" -} -##function end - -##dss-Server install -PACKAGE_DIR=dss/dss-server -SERVERNAME=dss-server -SERVER_IP=$DSS_SERVER_INSTALL_IP -SERVER_PORT=$DSS_SERVER_PORT -SERVER_HOME=$DSS_INSTALL_HOME -###install Dss-Server -installPackage -###update Dss-Server linkis.properties -echo "$SERVERNAME-step4:update linkis.properties" -SERVER_CONF_PATH=$SERVER_HOME/$SERVERNAME/conf/linkis.properties -executeCMD $SERVER_IP "sed -i \"s#wds.linkis.server.mybatis.datasource.url.*#wds.linkis.server.mybatis.datasource.url=jdbc:mysql://${MYSQL_HOST}:${MYSQL_PORT}/${MYSQL_DB}?characterEncoding=UTF-8#g\" $SERVER_CONF_PATH" -executeCMD $SERVER_IP "sed -i \"s#wds.linkis.server.mybatis.datasource.username.*#wds.linkis.server.mybatis.datasource.username=$MYSQL_USER#g\" $SERVER_CONF_PATH" -executeCMD $SERVER_IP "sed -i \"s#wds.linkis.server.mybatis.datasource.password.*#***REMOVED***$MYSQL_PASSWORD#g\" $SERVER_CONF_PATH" -executeCMD $SERVER_IP "sed -i \"s#wds.dss.appjoint.scheduler.azkaban.address.*#wds.dss.appjoint.scheduler.azkaban.address=http://${AZKABAN_ADRESS_IP}:${AZKABAN_ADRESS_PORT}#g\" $SERVER_CONF_PATH" -executeCMD $SERVER_IP "sed -i \"s#wds.linkis.gateway.ip.*#wds.linkis.gateway.ip=$GATEWAY_INSTALL_IP#g\" $SERVER_CONF_PATH" -executeCMD $SERVER_IP "sed -i \"s#wds.linkis.gateway.port.*#wds.linkis.gateway.port=$GATEWAY_PORT#g\" $SERVER_CONF_PATH" -executeCMD $SERVER_IP "sed -i 
\"s#wds.dss.appjoint.scheduler.project.store.dir.*#wds.dss.appjoint.scheduler.project.store.dir=$WDS_SCHEDULER_PATH#g\" $SERVER_CONF_PATH" -executeCMD $SERVER_IP "echo "$deployUser=$deployUser" >> $SERVER_HOME/$SERVERNAME/conf/token.properties" -isSuccess "subsitution linkis.properties of $SERVERNAME" -echo "<----------------$SERVERNAME:end------------------->" -echo "" - -if [[ '2' = "$INSTALL_MODE" ]]||[[ '3' = "$INSTALL_MODE" ]];then -##Flow execution Install -PACKAGE_DIR=dss/dss-flow-execution-entrance -SERVERNAME=dss-flow-execution-entrance -SERVER_IP=$FLOW_EXECUTION_INSTALL_IP -SERVER_PORT=$FLOW_EXECUTION_PORT -SERVER_HOME=$DSS_INSTALL_HOME -###Install flow execution -installPackage -###Update flow execution linkis.properties -echo "$SERVERNAME-step4:update linkis.properties" -SERVER_CONF_PATH=$SERVER_HOME/$SERVERNAME/conf/linkis.properties -executeCMD $SERVER_IP "sed -i \"s#wds.linkis.entrance.config.logPath.*#wds.linkis.entrance.config.logPath=$WORKSPACE_USER_ROOT_PATH#g\" $SERVER_CONF_PATH" -executeCMD $SERVER_IP "sed -i \"s#wds.linkis.resultSet.store.path.*#wds.linkis.resultSet.store.path=$RESULT_SET_ROOT_PATH#g\" $SERVER_CONF_PATH" -executeCMD $SERVER_IP "sed -i \"s#wds.linkis.gateway.url.*#wds.linkis.gateway.url=http://${GATEWAY_INSTALL_IP}:${GATEWAY_PORT}#g\" $SERVER_CONF_PATH" -isSuccess "subsitution linkis.properties of $SERVERNAME" -echo "<----------------$SERVERNAME:end------------------->" -echo "" -##Appjoint entrance Install -PACKAGE_DIR=plugins/linkis/linkis-appjoint-entrance -SERVERNAME=linkis-appjoint-entrance -SERVER_IP=$APPJOINT_ENTRANCE_INSTALL_IP -SERVER_PORT=$APPJOINT_ENTRANCE_PORT -SERVER_HOME=$DSS_INSTALL_HOME -###Install appjoint entrance -installPackage -###Update appjoint entrance linkis.properties -echo "$SERVERNAME-step4:update linkis.properties" -SERVER_CONF_PATH=$SERVER_HOME/$SERVERNAME/conf/linkis.properties -executeCMD $SERVER_IP "sed -i 
\"s#wds.linkis.entrance.config.logPath.*#wds.linkis.entrance.config.logPath=$WORKSPACE_USER_ROOT_PATH#g\" $SERVER_CONF_PATH" -executeCMD $SERVER_IP "sed -i \"s#wds.linkis.resultSet.store.path.*#wds.linkis.resultSet.store.path=$RESULT_SET_ROOT_PATH#g\" $SERVER_CONF_PATH" -isSuccess "subsitution linkis.properties of $SERVERNAME" -echo "<----------------$SERVERNAME:end------------------->" -echo "" -##visualis-server Install -PACKAGE_DIR=visualis-server -SERVERNAME=visualis-server -SERVER_IP=$VISUALIS_SERVER_INSTALL_IP -SERVER_PORT=$VISUALIS_SERVER_PORT -SERVER_HOME=$DSS_INSTALL_HOME -###install visualis-server -installVisualis -###update visualis-server linkis.properties -echo "$SERVERNAME-step4:update linkis.properties" -SERVER_CONF_PATH=$SERVER_HOME/$SERVERNAME/conf/linkis.properties -if [ $VISUALIS_NGINX_IP == "127.0.0.1" ]||[ $VISUALIS_NGINX_IP == "0.0.0.0" ]; then - VISUALIS_NGINX_IP=$ipaddr -fi -if [ $VISUALIS_SERVER_INSTALL_IP == "127.0.0.1" ]||[ $VISUALIS_SERVER_INSTALL_IP == "0.0.0.0" ]; then - VISUALIS_SERVER_INSTALL_IP=$ipaddr -fi -executeCMD $SERVER_IP "sed -i \"s#wds.linkis.entrance.config.logPath.*#wds.linkis.entrance.config.logPath=$WORKSPACE_USER_ROOT_PATH#g\" $SERVER_CONF_PATH" -executeCMD $SERVER_IP "sed -i \"s#wds.linkis.resultSet.store.path.*#wds.linkis.resultSet.store.path=$RESULT_SET_ROOT_PATH#g\" $SERVER_CONF_PATH" -executeCMD $SERVER_IP "sed -i \"s#wds.dss.visualis.gateway.ip.*#wds.dss.visualis.gateway.ip=$GATEWAY_INSTALL_IP#g\" $SERVER_CONF_PATH" -executeCMD $SERVER_IP "sed -i \"s#wds.dss.visualis.gateway.port.*#wds.dss.visualis.gateway.port=$GATEWAY_PORT#g\" $SERVER_CONF_PATH" -SERVER_CONF_PATH=$SERVER_HOME/$SERVERNAME/conf/application.yml -executeCMD $SERVER_IP "sed -i \"s#address: 127.0.0.1#address: $VISUALIS_SERVER_INSTALL_IP#g\" $SERVER_CONF_PATH" -executeCMD $SERVER_IP "sed -i \"s#port: 9007#port: $VISUALIS_SERVER_PORT#g\" $SERVER_CONF_PATH" -executeCMD $SERVER_IP "sed -i \"s#url: http://0.0.0.0:0000/dss/visualis#url: 
http://$VISUALIS_NGINX_IP:$VISUALIS_NGINX_PORT/dss/visualis#g\" $SERVER_CONF_PATH"
-executeCMD $SERVER_IP "sed -i \"s#address: 0.0.0.0#address: $VISUALIS_NGINX_IP#g\" $SERVER_CONF_PATH"
-executeCMD $SERVER_IP "sed -i \"s#port: 0000#port: $VISUALIS_NGINX_PORT#g\" $SERVER_CONF_PATH"
-executeCMD $SERVER_IP "sed -i \"s#defaultZone: http://127.0.0.1:20303/eureka/#defaultZone: http://$EUREKA_INSTALL_IP:$EUREKA_PORT/eureka/#g\" $SERVER_CONF_PATH"
-executeCMD $SERVER_IP "sed -i \"s#url: jdbc:mysql://127.0.0.1:3306/xxx?characterEncoding=UTF-8#url: jdbc:mysql://$MYSQL_HOST:$MYSQL_PORT/$MYSQL_DB?characterEncoding=UTF-8#g\" $SERVER_CONF_PATH"
-executeCMD $SERVER_IP "sed -i \"s#username: xxx#username: $MYSQL_USER#g\" $SERVER_CONF_PATH"
-executeCMD $SERVER_IP "sed -i \"s#password: xxx#password: $MYSQL_PASSWORD#g\" $SERVER_CONF_PATH"
-isSuccess "subsitution linkis.properties of $SERVERNAME"
-echo "<----------------$SERVERNAME:end------------------->"
-echo ""
-#APPJOINTS INSTALL
-echo "<----------------datachecker appjoint install start------------------->"
-APPJOINTPARENT=dss-appjoints
-APPJOINTNAME=datachecker
-#datachecker appjoint install
-installAppjoints
-echo "$APPJOINTNAME:subsitution conf"
-APPJOINTNAME_CONF_PATH_PATENT=$SERVER_HOME/$APPJOINTPARENT/$APPJOINTNAME/appjoint.properties
-executeCMD $SERVER_IP "sed -i \"s#job.datachecker.jdo.option.url.*#job.datachecker.jdo.option.url=$HIVE_META_URL#g\" $APPJOINTNAME_CONF_PATH_PATENT"
-executeCMD $SERVER_IP "sed -i \"s#job.datachecker.jdo.option.username.*#job.datachecker.jdo.option.username=$HIVE_META_USER#g\" $APPJOINTNAME_CONF_PATH_PATENT"
-executeCMD $SERVER_IP "sed -i \"s#job.datachecker.jdo.option.password.*#job.datachecker.jdo.option.password=$HIVE_META_PASSWORD#g\" $APPJOINTNAME_CONF_PATH_PATENT"
-isSuccess "subsitution conf of datachecker"
-echo "<----------------datachecker appjoint install end------------------->"
-echo ""
-echo "<----------------eventchecker appjoint install start------------------->"
-APPJOINTPARENT=dss-appjoints
-APPJOINTNAME=eventchecker
-#eventchecker appjoint install
-installAppjoints
-echo "$APPJOINTNAME:subsitution conf"
-APPJOINTNAME_CONF_PATH_PATENT=$SERVER_HOME/$APPJOINTPARENT/$APPJOINTNAME/appjoint.properties
-executeCMD $SERVER_IP "sed -i \"s#msg.eventchecker.jdo.option.url.*#msg.eventchecker.jdo.option.url=jdbc:mysql://${MYSQL_HOST}:${MYSQL_PORT}/${MYSQL_DB}?characterEncoding=UTF-8#g\" $APPJOINTNAME_CONF_PATH_PATENT"
-executeCMD $SERVER_IP "sed -i \"s#msg.eventchecker.jdo.option.username.*#msg.eventchecker.jdo.option.username=$MYSQL_USER#g\" $APPJOINTNAME_CONF_PATH_PATENT"
-executeCMD $SERVER_IP "sed -i \"s#msg.eventchecker.jdo.option.password.*#msg.eventchecker.jdo.option.password=$MYSQL_PASSWORD#g\" $APPJOINTNAME_CONF_PATH_PATENT"
-isSuccess "subsitution conf of eventchecker"
-echo "<----------------$APPJOINTNAME:end------------------->"
-echo ""
-echo "<----------------visualis appjoint install start------------------->"
-APPJOINTPARENT=dss-appjoints
-APPJOINTNAME=visualis
-#visualis appjoint install
-installAppjoints
-echo "<----------------$APPJOINTNAME:end------------------->"
-fi
-
-##lite and sample version does not install qualitis APPJoint and scheduis APPJoint
-if [[ '3' = "$INSTALL_MODE" ]];then
-echo ""
-echo "<----------------qualitis appjoint install start------------------->"
-APPJOINTPARENT=dss-appjoints
-APPJOINTNAME=qualitis
-#qualitis appjoint install
-installAppjoints
-APPJOINTNAME_CONF_PATH_PATENT=$SERVER_HOME/$APPJOINTPARENT/$APPJOINTNAME/appjoint.properties
-executeCMD $SERVER_IP "sed -i \"s#baseUrl=http://127.0.0.1:8090#baseUrl=http://$QUALITIS_ADRESS_IP:$QUALITIS_ADRESS_PORT#g\" $APPJOINTNAME_CONF_PATH_PATENT"
-isSuccess "subsitution conf of qualitis"
-echo "<----------------$APPJOINTNAME:end------------------->"
-echo ""
-echo "<----------------schedulis appjoint install start------------------->"
-APPJOINTPARENT=dss-appjoints
-APPJOINTNAME=schedulis
-#schedulis appjoint install
-installAppjoints
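The deleted install steps above repeat one idiom: rewrite a key in a properties file with `sed`, matching on the key prefix and replacing the whole line. A minimal local sketch of that idiom (file name and values are illustrative; the real script wraps the same command in `executeCMD` for optional remote execution):

```shell
# Sketch of the installer's sed idiom: '#' as the s/// delimiter avoids
# escaping the '/' in URLs and paths; "${key}.*" matches the old line.
update_prop() {
  key=$1; value=$2; file=$3
  sed -i "s#${key}.*#${key}=${value}#g" "$file"
}

conf=$(mktemp)
echo "wds.linkis.gateway.url=http://old-host:9001" > "$conf"
update_prop wds.linkis.gateway.url http://127.0.0.1:9001 "$conf"
cat "$conf"
```

Note the assumption of GNU `sed` (`-i` with no suffix argument), which matches the Linux deployments these scripts target.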
-isSuccess "subsitution conf of schedulis"
-echo "<----------------$APPJOINTNAME:end------------------->"
-fi
diff --git a/bin/start-all.sh b/bin/start-all.sh
deleted file mode 100644
index 98a07f6bd..000000000
--- a/bin/start-all.sh
+++ /dev/null
@@ -1,189 +0,0 @@
-#!/usr/bin/env bash
-#
-# Copyright 2019 WeBank
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-
-# Start all dss applications
-info="We will start all dss applications, it will take some time, please wait"
-echo ${info}
-
-#Actively load user env
-source /etc/profile
-source ~/.bash_profile
-
-shellDir=`dirname $0`
-
-workDir=`cd ${shellDir}/..;pwd`
-
-CONF_DIR="${workDir}"/conf
-
-export LINKIS_DSS_CONF_FILE=${LINKIS_DSS_CONF_FILE:-"${CONF_DIR}/config.sh"}
-export DISTRIBUTION=${DISTRIBUTION:-"${CONF_DIR}/config.sh"}
-source $LINKIS_DSS_CONF_FILE
-source ${DISTRIBUTION}
-function isSuccess(){
-if [ $? -ne 0 ]; then
- echo "ERROR: " + $1
- exit 1
-else
- echo "INFO:" + $1
-fi
-}
-local_host="`hostname --fqdn`"
-
-ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/(.*)/, "\\1", "g", $2)}'|awk 'NR==1')
-
-function isLocal(){
- if [ "$1" == "127.0.0.1" ];then
- return 0
- elif [ $1 == "localhost" ]; then
- return 0
- elif [ $1 == $local_host ]; then
- return 0
- elif [ $1 == $ipaddr ]; then
- return 0
- fi
- return 1
-}
-
-function executeCMD(){
- isLocal $1
- flag=$?
- echo "Is local "$flag
- if [ $flag == "0" ];then
- eval $2
- else
- ssh -p $SSH_PORT $1 $2
- fi
-
-}
-
-#if there is no LINKIS_INSTALL_HOME,we need to source config again
-if [ -z ${DSS_INSTALL_HOME} ];then
- echo "Warning: DSS_INSTALL_HOME does not exist, we will source config"
- if [ ! -f "${LINKIS_DSS_CONF_FILE}" ];then
- echo "Error: can not find config file, start applications failed"
- exit 1
- else
- source ${LINKIS_DSS_CONF_FILE}
- fi
-fi
-
-function startApp(){
-echo "<-------------------------------->"
-echo "Begin to start $SERVER_NAME"
-SERVER_BIN=${DSS_INSTALL_HOME}/${SERVER_NAME}/bin
-#echo $SERVER_BIN
-SERVER_LOCAL_START_CMD="dos2unix ${SERVER_BIN}/* > /dev/null 2>&1; dos2unix ${SERVER_BIN}/../conf/* > /dev/null 2>&1;sh ${SERVER_BIN}/start-${SERVER_NAME}.sh > /dev/null 2>&1 &"
-SERVER_REMOTE_START_CMD="source /etc/profile;source ~/.bash_profile;cd ${SERVER_BIN}; dos2unix ./* > /dev/null 2>&1; dos2unix ../conf/* > /dev/null 2>&1; sh start-${SERVER_NAME}.sh > /dev/null 2>&1"
-
-if test -z "$SERVER_IP"
-then
- SERVER_IP=$local_host
-fi
-
-if ! executeCMD $SERVER_IP "test -e $SERVER_BIN"; then
- echo "<-------------------------------->"
- echo "$SERVER_NAME is not installed,the start steps will be skipped"
- echo "<-------------------------------->"
- return
-fi
-
-isLocal $SERVER_IP
-flag=$?
-echo "Is local "$flag
-if [ $flag == "0" ];then
- eval $SERVER_LOCAL_START_CMD
-else
- ssh -p $SSH_PORT $SERVER_IP $SERVER_REMOTE_START_CMD
-fi
-isSuccess "End to start $SERVER_NAME"
-echo "<-------------------------------->"
-sleep 15 #for Eureka register
-}
-
-#dss-server
-SERVER_NAME=dss-server
-SERVER_IP=$DSS_SERVER_INSTALL_IP
-startApp
-
-#dss-flow-execution-entrance
-SERVER_NAME=dss-flow-execution-entrance
-SERVER_IP=$FLOW_EXECUTION_INSTALL_IP
-startApp
-
-#dss-flow-execution-entrance
-SERVER_NAME=linkis-appjoint-entrance
-SERVER_IP=$APPJOINT_ENTRANCE_INSTALL_IP
-startApp
-
-#visualis-server
-SERVER_NAME=visualis-server
-SERVER_IP=$VISUALIS_SERVER_INSTALL_IP
-startApp
-
-echo ""
-echo "Start to check all dss microservice"
-echo ""
-
-function checkServer(){
-echo "<-------------------------------->"
-echo "Begin to check $SERVER_NAME"
-if test -z "$SERVER_IP"
-then
- SERVER_IP=$local_host
-fi
-
-SERVER_BIN=${SERVER_HOME}/${SERVER_NAME}/bin
-
-if ! executeCMD $SERVER_IP "test -e ${DSS_INSTALL_HOME}/${SERVER_NAME}"; then
- echo "$SERVER_NAME is not installed,the checkServer steps will be skipped"
- return
-fi
-
-sh $workDir/bin/checkServices.sh $SERVER_NAME $SERVER_IP $SERVER_PORT
-isSuccess "start $SERVER_NAME "
-sleep 3
-echo "<-------------------------------->"
-}
-
-#check dss-server
-SERVER_NAME=dss-server
-SERVER_IP=$DSS_SERVER_INSTALL_IP
-SERVER_PORT=$DSS_SERVER_PORT
-checkServer
-
-
-#check dss-flow-execution-entrance
-SERVER_NAME=dss-flow-execution-entrance
-SERVER_IP=$FLOW_EXECUTION_INSTALL_IP
-SERVER_PORT=$FLOW_EXECUTION_PORT
-checkServer
-
-#check linkis-appjoint-entrance
-SERVER_NAME=linkis-appjoint-entrance
-SERVER_IP=$APPJOINT_ENTRANCE_INSTALL_IP
-SERVER_PORT=$APPJOINT_ENTRANCE_PORT
-checkServer
-
-
-#check visualis-server
-sleep 10 #visualis service need more time to register
-SERVER_NAME=visualis-server
-SERVER_IP=$VISUALIS_SERVER_INSTALL_IP
-SERVER_PORT=$VISUALIS_SERVER_PORT
-checkServer
-
-echo "DSS started successfully"
diff --git
a/bin/stop-all.sh b/bin/stop-all.sh
deleted file mode 100644
index 838b9babc..000000000
--- a/bin/stop-all.sh
+++ /dev/null
@@ -1,133 +0,0 @@
-#!/usr/bin/env bash
-#
-# Copyright 2019 WeBank
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-
-
-
-# Start all dss applications
-info="We will stop all dss applications, it will take some time, please wait"
-echo ${info}
-
-#Actively load user env
-source ~/.bash_profile
-
-workDir=`dirname "${BASH_SOURCE-$0}"`
-workDir=`cd "$workDir"; pwd`
-
-
-CONF_DIR="${workDir}"/../conf
-export LINKIS_DSS_CONF_FILE=${LINKIS_DSS_CONF_FILE:-"${CONF_DIR}/config.sh"}
-export DISTRIBUTION=${DISTRIBUTION:-"${CONF_DIR}/config.sh"}
-source ${DISTRIBUTION}
-
-local_host="`hostname --fqdn`"
-ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/(.*)/, "\\1", "g", $2)}'|awk 'NR==1')
-
-function isSuccess(){
-if [ $? -ne 0 ]; then
- echo "ERROR: " + $1
- exit 1
-else
- echo "INFO:" + $1
-fi
-}
-
-function isLocal(){
- if [ "$1" == "127.0.0.1" ];then
- return 0
- elif [ $1 == "localhost" ]; then
- return 0
- elif [ $1 == $local_host ]; then
- return 0
- elif [ $1 == $ipaddr ]; then
- return 0
- fi
- return 1
-}
-
-function executeCMD(){
- isLocal $1
- flag=$?
- echo "Is local "$flag
- if [ $flag == "0" ];then
- eval $2
- else
- ssh -p $SSH_PORT $1 $2
- fi
-
-}
-
-
-#if there is no LINKIS_INSTALL_HOME,we need to source config again
-if [ -z ${DSS_INSTALL_HOME} ];then
- echo "Warning: DSS_INSTALL_HOME does not exist, we will source config"
- if [ ! -f "${LINKIS_DSS_CONF_FILE}" ];then
- echo "Error: can not find config file, stop applications failed"
- exit 1
- else
- source ${LINKIS_DSS_CONF_FILE}
- fi
-fi
-
-function stopAPP(){
-echo "<-------------------------------->"
-echo "Begin to stop $SERVER_NAME"
-SERVER_BIN=${DSS_INSTALL_HOME}/${SERVER_NAME}/bin
-SERVER_LOCAL_STOP_CMD="sh ${SERVER_BIN}/stop-${SERVER_NAME}.sh"
-SERVER_REMOTE_STOP_CMD="source /etc/profile;source ~/.bash_profile;cd ${SERVER_BIN}; sh stop-${SERVER_NAME}.sh "
-if test -z "$SERVER_IP"
-then
- SERVER_IP=$local_host
-fi
-
-if ! executeCMD $SERVER_IP "test -e ${DSS_INSTALL_HOME}/${SERVER_NAME}"; then
- echo "$SERVER_NAME is not installed,the stop steps will be skipped"
- return
-fi
-
-isLocal $SERVER_IP
-flag=$?
-echo "Is local "$flag
-if [ $flag == "0" ];then
- eval $SERVER_LOCAL_STOP_CMD
-else
- ssh -p $SSH_PORT $SERVER_IP $SERVER_REMOTE_STOP_CMD
-fi
-echo "<-------------------------------->"
-sleep 3
-}
-
-#dss-server
-SERVER_NAME=dss-server
-SERVER_IP=$DSS_SERVER_INSTALL_IP
-stopAPP
-
-#dss-flow-execution-entrance
-SERVER_NAME=dss-flow-execution-entrance
-SERVER_IP=$FLOW_EXECUTION_INSTALL_IP
-stopAPP
-
-#dss-flow-execution-entrance
-SERVER_NAME=linkis-appjoint-entrance
-SERVER_IP=$APPJOINT_ENTRANCE_INSTALL_IP
-stopAPP
-
-#visualis-server
-SERVER_NAME=visualis-server
-SERVER_IP=$VISUALIS_SERVER_INSTALL_IP
-stopAPP
-
-echo "stop-all shell script executed completely"
diff --git a/conf/application-dss.yml b/conf/application-dss.yml
new file mode 100644
index 000000000..b0473838d
--- /dev/null
+++ b/conf/application-dss.yml
@@ -0,0 +1,23 @@
+
+eureka:
+  client:
+    serviceUrl:
+      defaultZone: http://127.0.0.1:20303/eureka/
+  #instance:
+    #prefer-ip-address: true
+    #instance-id: ${spring.cloud.client.ip-address}:${server.port}
+    #metadata-map:
+      #test: wedatasphere
+
+management:
+  endpoints:
+    web:
+      exposure:
+        include: refresh,info
+logging:
+  config: classpath:log4j2.xml
+
+#mybatis:
+#  configuration:
+#    log-impl: org.apache.ibatis.logging.stdout.StdOutImpl
+
diff --git a/conf/config.sh b/conf/config.sh
deleted file mode 100644
index 74c913ddc..000000000
--- a/conf/config.sh
+++ /dev/null
@@ -1,80 +0,0 @@
-#!/bin/sh
-
-shellDir=`dirname $0`
-workDir=`cd ${shellDir}/..;pwd`
-
-### deploy user
-deployUser=hadoop
-
-### The install home path of DSS,Must provided
-DSS_INSTALL_HOME=$workDir
-
-### Specifies the user workspace, which is used to store the user's script files and log files.
-### Generally local directory
-WORKSPACE_USER_ROOT_PATH=file:///tmp/linkis/
-### Path to store job ResultSet:file or hdfs path
-RESULT_SET_ROOT_PATH=hdfs:///tmp/linkis
-
-################### The install Configuration of all Micro-Services #####################
-#
-#    NOTICE:
-#       1.
If you just wanna try, the following micro-service configuration can be set without any settings.
-#          These services will be installed by default on this machine.
-#       2. In order to get the most complete enterprise-level features, we strongly recommend that you install
-#          the following microservice parameters
-#
-
-### DSS_SERVER
-### This service is used to provide dss-server capability.
-DSS_SERVER_INSTALL_IP=127.0.0.1
-DSS_SERVER_PORT=9004
-
-### Appjoint-Entrance
-### This service is used to provide Appjoint-Entrance capability.
-APPJOINT_ENTRANCE_INSTALL_IP=127.0.0.1
-APPJOINT_ENTRANCE_PORT=9005
-
-### Flow-Execution-Entrance
-### This service is used to provide flow execution capability.
-FLOW_EXECUTION_INSTALL_IP=127.0.0.1
-FLOW_EXECUTION_PORT=9006
-
-### Linkis EUREKA information.
-EUREKA_INSTALL_IP=127.0.0.1 # Microservices Service Registration Discovery Center
-EUREKA_PORT=20303
-
-### Linkis Gateway information
-GATEWAY_INSTALL_IP=127.0.0.1
-GATEWAY_PORT=9001
-
-### SSH Port
-SSH_PORT=22
-
-### 1、DataCheck APPJOINT,This service is used to provide DataCheck capability.
-HIVE_META_URL=jdbc:mysql://127.0.0.1:3306/hivemeta?characterEncoding=UTF-8
-HIVE_META_USER=xxx
-HIVE_META_PASSWORD=xxx
-
-#Used to store the azkaban project transformed by DSS
-WDS_SCHEDULER_PATH=file:///appcom/tmp/wds/scheduler
-
-###The IP address and port are written into the database here, so be sure to plan ahead
-## visualis-server
-VISUALIS_SERVER_INSTALL_IP=127.0.0.1
-VISUALIS_SERVER_PORT=9007
-### visualis nginx acess ip,keep consistent with DSS front end
-VISUALIS_NGINX_IP=127.0.0.1
-VISUALIS_NGINX_PORT=8088
-
-### Eventchecker APPJOINT
-### This service is used to provide Eventchecker capability. it's config in db.sh same as dss-server.
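The deleted config.sh/db.sh pair above is plain shell that the install scripts `source`. A hedged sketch of how such a file can be consumed and sanity-checked (the placeholder check is illustrative and not part of the original scripts; it uses bash variable indirection):

```shell
# Illustrative only: source a db.sh-style file and fail fast when a
# placeholder value ("xxx") was left unchanged.
conf_dir=$(mktemp -d)
cat > "$conf_dir/db.sh" <<'EOF'
MYSQL_HOST=127.0.0.1
MYSQL_PORT=3306
MYSQL_DB=dss
MYSQL_USER=dss_user
MYSQL_PASSWORD=secret
EOF

source "$conf_dir/db.sh"
for v in MYSQL_HOST MYSQL_PORT MYSQL_DB MYSQL_USER MYSQL_PASSWORD; do
  # ${!v} expands the variable whose name is stored in v
  if [ -z "${!v}" ] || [ "${!v}" = "xxx" ]; then
    echo "please set $v in db.sh" >&2
    exit 1
  fi
done
echo "db config ok: $MYSQL_USER@$MYSQL_HOST:$MYSQL_PORT/$MYSQL_DB"
```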
-
-#azkaban address for check
-AZKABAN_ADRESS_IP=127.0.0.1
-AZKABAN_ADRESS_PORT=8081
-
-#qualitis.address for check
-QUALITIS_ADRESS_IP=127.0.0.1
-QUALITIS_ADRESS_PORT=8090
-
-DSS_VERSION=0.9.1
diff --git a/conf/db.sh b/conf/db.sh
deleted file mode 100644
index 68b07a9b3..000000000
--- a/conf/db.sh
+++ /dev/null
@@ -1,8 +0,0 @@
-### for DSS-Server and Eventchecker APPJOINT
-MYSQL_HOST=127.0.0.1
-MYSQL_PORT=3306
-MYSQL_DB=xxx
-MYSQL_USER=xxx
-MYSQL_PASSWORD=xxx
-
-
diff --git a/conf/dss-apiservice-server.properties b/conf/dss-apiservice-server.properties
new file mode 100644
index 000000000..5162f42bc
--- /dev/null
+++ b/conf/dss-apiservice-server.properties
@@ -0,0 +1,42 @@
+#
+# Copyright 2019 WeBank
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
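Both deleted lifecycle scripts (bin/start-all.sh and bin/stop-all.sh above) share the same `isLocal`/`executeCMD` helpers: evaluate the command in the current shell when the target resolves to this host, otherwise ship it over `ssh`. A condensed, self-contained sketch of that dispatch (host names are examples; the originals also match against the machine's FQDN and first global IP):

```shell
# Condensed version of the scripts' local-vs-remote dispatch helpers.
local_host=$(hostname)

isLocal() {
  case "$1" in
    127.0.0.1|0.0.0.0|localhost|"$local_host") return 0 ;;
    *) return 1 ;;
  esac
}

executeCMD() {
  if isLocal "$1"; then
    eval "$2"                          # local: evaluate in this shell
  else
    ssh -p "${SSH_PORT:-22}" "$1" "$2" # remote: run over ssh
  fi
}

executeCMD 127.0.0.1 'echo ran-locally'
```

One design consequence worth noting: because the local branch uses `eval`, variable expansions inside the command string behave the same locally and remotely, which is why the install steps quote their `sed` programs so carefully.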
+#
+#
+
+# Spring configurations
+spring.server.port=9206
+spring.spring.application.name=dss-apiservice-server
+
+wds.linkis.server.mybatis.mapperLocations=classpath*:com/webank/wedatasphere/dss/apiservice/core/dao/mapper/*.xml
+wds.linkis.server.mybatis.typeAliasesPackage=com.webank.wedatasphere.dss.apiservice.core.bo,com.webank.wedatasphere.dss.apiservice.core.vo
+wds.linkis.server.mybatis.BasePackage=com.webank.wedatasphere.dss.apiservice.core.dao
+
+wds.linkis.server.restful.scan.packages=com.webank.wedatasphere.dss.apiservice.core.restful
+
+#sit
+wds.linkis.server.version=v1
+wds.linkis.server.url=
+
+#test
+wds.linkis.test.mode=false
+wds.linkis.test.user=
+
+
+#dsm
+wds.linkis.server.dsm.admin.users=
+
+
+# datasource configuration used for execution
+wds.linkis.datasource.hikari.maximumPoolSize=100
+wds.linkis.datasource.hikari.minimumIdle=10
diff --git a/conf/dss-datapipe-server.properties b/conf/dss-datapipe-server.properties
new file mode 100644
index 000000000..b3d4a8c75
--- /dev/null
+++ b/conf/dss-datapipe-server.properties
@@ -0,0 +1,33 @@
+#
+# Copyright 2019 WeBank
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+#
+
+# Spring configurations
+spring.server.port=9009
+spring.spring.application.name=dss-datapipe-server
+
+wds.linkis.server.mybatis.mapperLocations=classpath*:com/webank/wedatasphere/dss/datapipe/dao/mapper/*.xml
+wds.linkis.server.mybatis.typeAliasesPackage=com.webank.wedatasphere.dss.datapipe.vo
+wds.linkis.server.mybatis.BasePackage=com.webank.wedatasphere.dss.datapipe.dao
+
+wds.linkis.server.restful.scan.packages=com.webank.wedatasphere.dss.datapipe.restful
+
+#sit
+wds.linkis.server.version=v1
+wds.linkis.server.url=
+
+#test
+wds.linkis.test.mode=false
+wds.linkis.test.user=
diff --git a/conf/dss-flow-execution-server.properties b/conf/dss-flow-execution-server.properties
new file mode 100644
index 000000000..52b151cb3
--- /dev/null
+++ b/conf/dss-flow-execution-server.properties
@@ -0,0 +1,54 @@
+#
+# Copyright 2019 WeBank
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
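Each of the new conf/*.properties files pins its own `spring.server.port` (9202-9207 and 9009 in this patch), so the services can register side by side with Eureka. A quick, illustrative duplicate-port scan over such a directory (the directory and file names here are stand-ins):

```shell
# Illustrative check: every DSS microservice must get a unique port.
confdir=$(mktemp -d)
printf 'spring.server.port=9206\n' > "$confdir/dss-apiservice-server.properties"
printf 'spring.server.port=9009\n' > "$confdir/dss-datapipe-server.properties"

# Any port value appearing twice shows up in $dupes.
dupes=$(grep -h '^spring.server.port=' "$confdir"/*.properties | sort | uniq -d)
if [ -z "$dupes" ]; then
  echo "no port clashes"
else
  echo "duplicate ports: $dupes" >&2
fi
```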
+#
+#
+
+# Spring configurations
+spring.server.port=9205
+
+spring.spring.application.name=dss-flow-entrance
+
+##mybatis
+wds.linkis.server.mybatis.mapperLocations=classpath*:com/webank/wedatasphere/dss/flow/execution/entrance/dao/impl/*.xml,classpath*:com/webank/wedatasphere/linkis/jobhistory/dao/impl/*.xml
+
+wds.linkis.server.mybatis.typeAliasesPackage=
+
+wds.linkis.server.mybatis.BasePackage=com.webank.wedatasphere.dss.flow.execution.entrance.dao,com.webank.wedatasphere.linkis.jobhistory.dao
+
+
+wds.linkis.server.restful.scan.packages=com.webank.wedatasphere.linkis.entrance.restful,com.webank.wedatasphere.dss.flow.execution.entrance.restful
+
+#wds.linkis.server.component.exclude.classes=com.webank.wedatasphere.linkis.DataWorkCloudApplication
+
+wds.linkis.engine.application.name=flowExecutionEngine
+wds.linkis.enginemanager.application.name=flowExecution
+
+wds.linkis.query.application.name=linkis-ps-publicservice
+
+wds.linkis.console.config.application.name=linkis-ps-publicservice
+wds.linkis.engine.creation.wait.time.max=20m
+wds.linkis.server.version=v1
+
+wds.linkis.server.socket.mode=true
+
+wds.linkis.client.flow.adminuser=ws
+wds.linkis.client.flow.author.user.token=WS-AUTH
+
+wds.linkis.server.component.exclude.classes=com.webank.wedatasphere.linkis.entranceclient.conf.ClientForEntranceSpringConfiguration,com.webank.wedatasphere.linkis.entranceclient.conf.ClientSpringConfiguration
+
+wds.linkis.server.component.exclude.packages=com.webank.wedatasphere.linkis.entrance.restful.
+spring.spring.main.allow-bean-definition-overriding=true
+
+wds.linkis.entrance.config.log.path=file:///appcom/tmp/dss/
+
diff --git a/conf/dss-framework-orchestrator-server.properties b/conf/dss-framework-orchestrator-server.properties
new file mode 100644
index 000000000..c1f80dc56
--- /dev/null
+++ b/conf/dss-framework-orchestrator-server.properties
@@ -0,0 +1,42 @@
+#
+# Copyright 2019 WeBank
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+#
+
+# Spring configurations
+spring.server.port=9203
+spring.spring.application.name=dss-framework-orchestrator-server
+
+wds.linkis.test.mode=true
+
+wds.linkis.test.user=neiljianliu
+
+wds.linkis.log.clear=true
+
+wds.linkis.server.version=v1
+
+##restful
+wds.linkis.server.restful.scan.packages=com.webank.wedatasphere.dss.orchestrator.server.restful
+
+##mybatis
+wds.linkis.server.mybatis.mapperLocations=classpath*:com/webank/wedatasphere/dss/framework/appconn/dao/impl/*.xml,classpath*:com/webank/wedatasphere/dss/orchestrator/core/dao/impl/*.xml,classpath*:com/webank/wedatasphere/dss/server/dao/impl/*.xml,classpath*:com/webank/wedatasphere/dss/application/dao/impl/*.xml,classpath*:com/webank/wedatasphere/dss/workspace/mapper/impl/*.xml,classpath*:com/webank/wedatasphere/dss/workspace/common/dao/impl/*.xml,classpath*:com/webank/wedatasphere/dss/orchestrator/db/dao/impl/*.xml
+
+wds.linkis.server.mybatis.typeAliasesPackage=com.webank.wedatasphere.dss.server.entity,com.webank.wedatasphere.dss.application.entity,com.webank.wedatasphere.dss.framework.appconn.entity
+
+wds.linkis.server.mybatis.BasePackage=com.webank.wedatasphere.dss.framework.appconn.dao,com.webank.wedatasphere.dss.orchestrator.core.dao,com.webank.wedatasphere.dss.server.dao,com.webank.wedatasphere.dss.application.dao,com.webank.wedatasphere.dss.workspace.mapper,com.webank.wedatasphere.dss.workspace.common.dao,com.webank.wedatasphere.dss.workspace.common.dao,com.webank.wedatasphere.dss.orchestrator.db.dao
+
+wds.dss.appconn.scheduler.project.store.dir=file:///appcom/tmp/wds/scheduler
+wds.dss.appconn.scheduler.azkaban.login.passwd=userpwd
+##export file dir
+wds.dss.server.export.url=/appcom/tmp/dss
diff --git a/conf/dss-framework-project-server.properties b/conf/dss-framework-project-server.properties
new file mode 100644
index 000000000..a4d2356d0
--- /dev/null
+++ b/conf/dss-framework-project-server.properties
@@ -0,0 +1,35 @@
+#
+# Copyright 2019 WeBank
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+#
+
+# Spring configurations
+spring.server.port=9202
+spring.spring.application.name=dss-framework-project-server
+
+wds.linkis.log.clear=true
+
+wds.linkis.server.version=v1
+
+##restful
+wds.linkis.server.restful.scan.packages=com.webank.wedatasphere.dss.framework.workspace.restful,com.webank.wedatasphere.dss.framework.project.restful,com.webank.wedatasphere.dss.framework.release.restful,com.webank.wedatasphere.dss.framework.appconn.restful
+
+##mybatis
+wds.linkis.server.mybatis.mapperLocations=classpath*:com/webank/wedatasphere/dss/framework/workspace/dao/impl/*.xml,classpath*:com/webank/wedatasphere/dss/application/dao/impl/*.xml,classpath*:com/webank/wedatasphere/dss/framework/project/dao/impl/*Mapper.xml,classpath*:com/webank/wedatasphere/dss/framework/appconn/dao/impl/*.xml,classpath*:com/webank/wedatasphere/dss/framework/release/dao/impl/*.xml
+
+wds.linkis.server.mybatis.typeAliasesPackage=com.webank.wedatasphere.dss.application.entity,com.webank.wedatasphere.dss.common.entity,com.webank.wedatasphere.dss.framework.workspace.bean,com.webank.wedatasphere.dss.framework.project.entity,com.webank.wedatasphere.dss.framework.appconn.entity,com.webank.wedatasphere.dss.framework.release.entity
+
+wds.linkis.server.mybatis.BasePackage=com.webank.wedatasphere.dss.framework.workspace.dao,com.webank.wedatasphere.dss.application.dao,com.webank.wedatasphere.dss.framework.project.dao,com.webank.wedatasphere.dss.framework.appconn.dao,com.webank.wedatasphere.dss.framework.release.dao
+
+
diff --git a/conf/dss-workflow-server.properties b/conf/dss-workflow-server.properties
new file mode 100644
index 000000000..d782c9a35
--- /dev/null
+++ b/conf/dss-workflow-server.properties
@@ -0,0 +1,44 @@
+#
+# Copyright 2019 WeBank
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+#
+
+# Spring configurations
+spring.server.port=9207
+spring.spring.application.name=dss-workflow-server
+
+wds.linkis.test.mode=true
+
+wds.linkis.test.user=neiljianliu
+
+wds.linkis.log.clear=true
+
+wds.linkis.server.version=v1
+
+##restful
+wds.linkis.server.restful.scan.packages=com.webank.wedatasphere.dss.workflow.restful
+
+##mybatis
+wds.linkis.server.mybatis.mapperLocations=classpath*:com/webank/wedatasphere/dss/workflow/dao/impl/*.xml,classpath*:com/webank/wedatasphere/dss/framework/appconn/dao/impl/*.xml
+
+wds.linkis.server.mybatis.typeAliasesPackage=com.webank.wedatasphere.dss.workflow.entity,com.webank.wedatasphere.dss.framework.appconn.entity
+
+wds.linkis.server.mybatis.BasePackage=com.webank.wedatasphere.dss.workflow.dao,com.webank.wedatasphere.dss.framework.appconn.dao
+
+wds.dss.appconn.scheduler.project.store.dir=file:///appcom/tmp/wds/scheduler
+wds.dss.appconn.scheduler.azkaban.login.passwd=
+##import file dir
+wds.dss.file.upload.dir=/appcom/tmp/uploads
+
+wds.dss.server.export.env=DEV
\ No newline at end of file
diff --git a/conf/dss.properties b/conf/dss.properties
new file mode 100644
index 000000000..e381e4054
--- /dev/null
+++ b/conf/dss.properties
@@ -0,0 +1,30 @@
+#
+# Copyright 2019 WeBank
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+#
+
+wds.linkis.gateway.ip=127.0.0.1
+wds.linkis.gateway.port=9001
+wds.linkis.gateway.url=http://127.0.0.1:9001/
+wds.linkis.gateway.wtss.url=http://127.0.0.1:9001/
+
+wds.linkis.mysql.is.encrypt=false
+wds.linkis.server.mybatis.datasource.url=jdbc:mysql://127.0.0.1:3306/dss_dev?characterEncoding=UTF-8
+wds.linkis.server.mybatis.datasource.username=
+***REMOVED***
+
+wds.dss.esb.appid=
+wds.dss.esb.token=
+
+wds.dss.appconn.scheduler.job.label=dev
\ No newline at end of file
diff --git a/conf/log4j.properties b/conf/log4j.properties
new file mode 100644
index 000000000..c94a1c841
--- /dev/null
+++ b/conf/log4j.properties
@@ -0,0 +1,36 @@
+#
+# Copyright 2019 WeBank
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
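The new conf/dss.properties carries the gateway address three ways (`wds.linkis.gateway.ip`, `.port`, and `.url`), so keeping them consistent matters at deploy time. A hedged sketch that derives host and port back out of the URL value using only shell parameter expansion (the URL is the file's default; the consistency use case is an assumption, not something the patch itself does):

```shell
# Split wds.linkis.gateway.url into host and port, e.g. to cross-check
# it against wds.linkis.gateway.ip and wds.linkis.gateway.port.
gateway_url="http://127.0.0.1:9001/"
hostport=${gateway_url#http://}   # strip the scheme
hostport=${hostport%%/*}          # strip any trailing path
host=${hostport%%:*}
port=${hostport##*:}
echo "gateway host=$host port=$port"
```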
+#
+#
+
+### set log levels ###
+
+log4j.rootCategory=INFO,console
+
+log4j.appender.console=org.apache.log4j.ConsoleAppender
+log4j.appender.console.Threshold=INFO
+log4j.appender.console.layout=org.apache.log4j.PatternLayout
+#log4j.appender.console.layout.ConversionPattern= %d{ISO8601} %-5p (%t) [%F:%M(%L)] - %m%n
+log4j.appender.console.layout.ConversionPattern= %d{ISO8601} %-5p (%t) %p %c{1} - %m%n
+
+
+log4j.appender.com.webank.bdp.ide.core=org.apache.log4j.DailyRollingFileAppender
+log4j.appender.com.webank.bdp.ide.core.Threshold=INFO
+log4j.additivity.com.webank.bdp.ide.core=false
+log4j.appender.com.webank.bdp.ide.core.layout=org.apache.log4j.PatternLayout
+log4j.appender.com.webank.bdp.ide.core.Append=true
+log4j.appender.com.webank.bdp.ide.core.File=logs/dss-apiservice-server.log
+log4j.appender.com.webank.bdp.ide.core.layout.ConversionPattern= %d{ISO8601} %-5p (%t) [%F:%M(%L)] - %m%n
+
+log4j.logger.org.springframework=INFO
\ No newline at end of file
diff --git a/conf/log4j2.xml b/conf/log4j2.xml
new file mode 100644
index 000000000..8e22a9320
--- /dev/null
+++ b/conf/log4j2.xml
@@ -0,0 +1,38 @@
[the 38-line XML body of conf/log4j2.xml was stripped during extraction; only empty added-line markers remained]
diff --git a/conf/token.properties b/conf/token.properties
new file mode 100644
index 000000000..5674d0245
--- /dev/null
+++ b/conf/token.properties
@@ -0,0 +1,17 @@
+#
+# Copyright 2019 WeBank
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
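The new conf/token.properties below holds one `user=password` entry per line, and the deleted install script earlier in this diff populates it the same way, by appending `echo "$deployUser=$deployUser" >> .../token.properties`. A minimal sketch of that append (the user name and file location are examples):

```shell
# Append a deploy-user entry the way the install script does.
tokens=$(mktemp)
deployUser=hadoop
echo "$deployUser=$deployUser" >> "$tokens"
cat "$tokens"
```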
+# +# + +${userName}=${password} \ No newline at end of file diff --git a/datachecker-appjoint/pom.xml b/datachecker-appjoint/pom.xml deleted file mode 100644 index d609c3ab3..000000000 --- a/datachecker-appjoint/pom.xml +++ /dev/null @@ -1,139 +0,0 @@ - - - - - - dss - com.webank.wedatasphere.dss - 0.9.1 - - 4.0.0 - - dss-datachecker-appjoint - - - - org.apache.commons - commons-lang3 - 3.4 - - - - com.alibaba - druid - 1.0.28 - - - - com.webank.wedatasphere.dss - dss-appjoint-core - ${dss.version} - provided - true - - - - - - - - - - - - - - - - log4j - log4j - 1.2.17 - - - - - - - - - - org.apache.maven.plugins - maven-deploy-plugin - - - - net.alchim31.maven - scala-maven-plugin - - - org.apache.maven.plugins - maven-jar-plugin - - - org.apache.maven.plugins - maven-assembly-plugin - 2.3 - false - - - make-assembly - package - - single - - - - src/main/assembly/distribution.xml - - - - - - false - dss-datachecker-appjoint - false - false - - src/main/assembly/distribution.xml - - - - - - - src/main/java - - **/*.xml - - - - src/main/resources - - **/*.properties - **/application.yml - **/bootstrap.yml - **/log4j2.xml - - - - - - - \ No newline at end of file diff --git a/datachecker-appjoint/src/main/assembly/distribution.xml b/datachecker-appjoint/src/main/assembly/distribution.xml deleted file mode 100644 index 503ab0f38..000000000 --- a/datachecker-appjoint/src/main/assembly/distribution.xml +++ /dev/null @@ -1,136 +0,0 @@ - - - - dss-datachecker-appjoint - - zip - - true - datachecker - - - - - - lib - true - true - false - true - true - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - ${basedir}/src/main/resources - - appjoint.properties - - 0777 - / - unix - - - - ${basedir}/src/main/resources - - log4j.properties - log4j2.xml - - 0777 - conf - unix - - - - - - diff --git 
a/datachecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/connector/DataCheckerDao.java b/datachecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/connector/DataCheckerDao.java deleted file mode 100644 index d84a0d4bc..000000000 --- a/datachecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/connector/DataCheckerDao.java +++ /dev/null @@ -1,195 +0,0 @@ -/* - * Copyright 2019 WeBank - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - * - */ - -package com.webank.wedatasphere.dss.appjoint.schedulis.jobtype.connector; - -import com.alibaba.druid.pool.DruidDataSource; -import com.webank.wedatasphere.dss.appjoint.schedulis.jobtype.entity.DataChecker; -import org.apache.log4j.Logger; - -import javax.sql.DataSource; -import java.sql.Connection; -import java.sql.PreparedStatement; -import java.sql.ResultSet; -import java.sql.SQLException; -import java.util.HashMap; -import java.util.List; -import java.util.Map; -import java.util.Properties; -import java.util.regex.Matcher; -import java.util.regex.Pattern; -import java.util.stream.Collectors; - -public class DataCheckerDao { - - private static final String SQL_SOURCE_TYPE_JOB_TABLE = - "SELECT * FROM DBS d JOIN TBLS t ON t.DB_ID = d.DB_ID WHERE d.NAME=? 
AND t.TBL_NAME=?"; - - private static final String SQL_SOURCE_TYPE_JOB_PARTITION = - "SELECT * FROM DBS d JOIN TBLS t ON t.DB_ID = d.DB_ID JOIN PARTITIONS p ON p.TBL_ID = t.TBL_ID WHERE d.NAME=? AND t.TBL_NAME=? AND p.PART_NAME=?"; - - private static DataSource jobDS; - private static DataCheckerDao instance; - - public static DataCheckerDao getInstance() { - if (instance == null) { - synchronized (DataCheckerDao.class) { - if (instance == null) { - instance = new DataCheckerDao(); - } - } - } - return instance; - } - - public boolean validateTableStatusFunction(Properties props, Logger log) { - if (jobDS == null) { - jobDS = DataDruidFactory.getJobInstance(props, log); - if (jobDS == null) { - log.error("Error getting Druid DataSource instance"); - return false; - } - } - removeBlankSpace(props); - log.info("=============================Data Check Start=========================================="); - String dataCheckerInfo = props.getProperty(DataChecker.DATA_OBJECT); - log.info("(DataChecker info) database table partition info : " + dataCheckerInfo); - long waitTime = Long.valueOf(props.getProperty(DataChecker.WAIT_TIME, "1")) * 3600 * 1000; - int queryFrequency = Integer.valueOf(props.getProperty(DataChecker.QUERY_FREQUENCY, "30000")); -// String timeScape = props.getProperty(DataChecker.TIME_SCAPE, "NULL"); - log.info("(DataChecker info) wait time : " + waitTime); - log.info("(DataChecker info) query frequency : " + queryFrequency); -// log.info("(DataChecker info) time scape : " + timeScape); - List> dataObjectList = extractProperties(props); - try (Connection conn = jobDS.getConnection()) { - boolean flag = dataObjectList - .stream() - .allMatch(proObjectMap -> { - long count = getTotalCount(proObjectMap, conn, log); - return count > 0; - }); - if (flag){ - log.info("=============================Data Check End=========================================="); - return true; - } - - } catch (SQLException e) { - throw new RuntimeException("get DataChecker result 
failed", e); - } - - log.info("=============================Data Check End=========================================="); - return false; - } - - private void sleep(long sleepTime) { - try { - Thread.sleep(sleepTime); - } catch (InterruptedException e) { - e.printStackTrace(); - } - } - - private void removeBlankSpace(Properties props) { - try { - props.entrySet().forEach(entry -> { - String value = entry.getValue().toString().replaceAll(" ", "").trim(); - entry.setValue(value); - }); - }catch (Exception e){ - throw new RuntimeException("remove job space char failed",e); - } - } - - private List> extractProperties(Properties p) { - return p.keySet().stream() - .map(key -> key2Map(key, p)).filter(x -> x.size() >0) - .collect(Collectors.toList()); - } - - private Map key2Map(Object key, Properties p) { - Map proMap = new HashMap<>(); - String skey = String.valueOf(key); - if(skey.contains(DataChecker.DATA_OBJECT)){ - String[] keyArr = skey.split("\\."); - if(keyArr.length == 3){ - String keyNum = keyArr[2]; - String doKey = DataChecker.DATA_OBJECT + "." + keyNum; - proMap.put(DataChecker.DATA_OBJECT, String.valueOf(p.get(doKey))); - }else{ - String doKey = DataChecker.DATA_OBJECT; - proMap.put(DataChecker.DATA_OBJECT, String.valueOf(p.get(doKey))); - } - } - return proMap; - } - - private long getTotalCount(Map proObjectMap, Connection conn, Logger log) { - String dataObject = proObjectMap.get(DataChecker.DATA_OBJECT); - if(dataObject != null) { - dataObject = dataObject.replace(" ", "").trim(); - }else{ - log.error("DataObject is null"); - return 0; - } - log.info("-------------------------------------- search hive/spark/mr data "); - log.info("-------------------------------------- : " + dataObject); - try (PreparedStatement pstmt = getStatement(conn, dataObject)) { - ResultSet rs = pstmt.executeQuery(); - return rs.last() ? 
rs.getRow() : 0; - } catch (SQLException e) { - log.error("fetch data from Hive MetaStore error", e); - return 0; - } - } - - private PreparedStatement getStatement(Connection conn, String dataObject) throws SQLException { - String dataScape = dataObject.contains("{") ? "Partition" : "Table"; - String[] dataObjectArray = dataObject.split("\\."); - String dbName = dataObjectArray[0]; - String tableName = dataObjectArray[1]; - if(dataScape.equals("Partition")) { - Pattern pattern = Pattern.compile("\\{([^\\}]+)\\}"); - Matcher matcher = pattern.matcher(dataObject); - String partitionName = null; - if(matcher.find()){ - partitionName = matcher.group(1); - } - partitionName = partitionName.replace("\'", "").replace("\"", ""); - tableName = tableName.split("\\{")[0]; - PreparedStatement pstmt = conn.prepareCall(SQL_SOURCE_TYPE_JOB_PARTITION); - pstmt.setString(1, dbName); - pstmt.setString(2, tableName); - pstmt.setString(3, partitionName); - return pstmt; - } else if(dataObjectArray.length == 2){ - PreparedStatement pstmt = conn.prepareCall(SQL_SOURCE_TYPE_JOB_TABLE); - pstmt.setString(1, dbName); - pstmt.setString(2, tableName); - return pstmt; - }else { - throw new SQLException("Incorrect input format for dataObject "+ dataObject); - } - } - - public static void closeDruidDataSource(){ - DruidDataSource jobDSObject = (DruidDataSource)jobDS; - if(jobDSObject != null){ - jobDSObject.close(); - } - } - -} diff --git a/datachecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/connector/DataDruidFactory.java b/datachecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/connector/DataDruidFactory.java deleted file mode 100644 index b55868125..000000000 --- a/datachecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/connector/DataDruidFactory.java +++ /dev/null @@ -1,122 +0,0 @@ -/* - * Copyright 2019 WeBank - * - * Licensed under the Apache License, Version 2.0 (the "License"); - 
* you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - * - */ - -package com.webank.wedatasphere.dss.appjoint.schedulis.jobtype.connector; - -import com.alibaba.druid.pool.DruidDataSource; -import org.apache.commons.lang3.StringUtils; -import org.apache.log4j.Logger; - -import java.util.Base64; -import java.util.Properties; - -public class DataDruidFactory { - private static DruidDataSource jobInstance; - private static DruidDataSource msgInstance; - - public static DruidDataSource getJobInstance(Properties props, Logger log) { - if (jobInstance == null ) { - synchronized (DataDruidFactory.class) { - if(jobInstance == null) { - try { - jobInstance = createDataSource(props, log, "Job"); - } catch (Exception e) { - throw new RuntimeException("Error creating Druid DataSource", e); - } - } - } - } - return jobInstance; - } - - public static DruidDataSource getMsgInstance(Properties props, Logger log) { - if (msgInstance == null ) { - synchronized (DataDruidFactory.class) { - if(msgInstance == null) { - try { - msgInstance = createDataSource(props, log, "Msg"); - } catch (Exception e) { - throw new RuntimeException("Error creating Druid DataSource", e); - } - } - } - } - return msgInstance; - } - - private static DruidDataSource createDataSource(Properties props, Logger log, String type) { - String name = null; - String url = null; - String username = null; - String password = null; - - if (type.equals("Job")) { - name = props.getProperty("job.datachecker.jdo.option.name"); - url = props.getProperty("job.datachecker.jdo.option.url"); - 
username = props.getProperty("job.datachecker.jdo.option.username"); - password = props.getProperty("job.datachecker.jdo.option.password"); - try { -// password = new String(Base64.getDecoder().decode(props.getProperty("job.datachecker.jdo.option.password").getBytes()),"UTF-8"); - password = props.getProperty("job.datachecker.jdo.option.password"); - } catch (Exception e){ - log.error("password decore failed" + e); - } - } - int initialSize = Integer.valueOf(props.getProperty("datachecker.jdo.option.initial.size", "1")); - int maxActive = Integer.valueOf(props.getProperty("datachecker.jdo.option.max.active", "100")); - int minIdle = Integer.valueOf(props.getProperty("datachecker.jdo.option.min.idle", "1")); - long maxWait = Long.valueOf(props.getProperty("datachecker.jdo.option.max.wait", "60000")); - String validationQuery = props.getProperty("datachecker.jdo.option.validation.quert", "SELECT 'x'"); - long timeBetweenEvictionRunsMillis = Long.valueOf(props.getProperty("datachecker.jdo.option.time.between.eviction.runs.millis", "6000")); - long minEvictableIdleTimeMillis = Long.valueOf(props.getProperty("datachecker.jdo.option.evictable.idle,time.millis", "300000")); - boolean testOnBorrow = Boolean.valueOf(props.getProperty("datachecker.jdo.option.test.on.borrow", "true")); - int maxOpenPreparedStatements = Integer.valueOf(props.getProperty("datachecker.jdo.option.max.open.prepared.statements", "-1")); - - - if (timeBetweenEvictionRunsMillis > minEvictableIdleTimeMillis) { - timeBetweenEvictionRunsMillis = minEvictableIdleTimeMillis; - } - - DruidDataSource ds = new DruidDataSource(); - - if (StringUtils.isNotBlank(name)) { - ds.setName(name); - } - - ds.setUrl(url); - ds.setDriverClassName("com.mysql.jdbc.Driver"); - ds.setUsername(username); - ds.setPassword(password); - ds.setInitialSize(initialSize); - ds.setMinIdle(minIdle); - ds.setMaxActive(maxActive); - ds.setMaxWait(maxWait); - ds.setTestOnBorrow(testOnBorrow); - ds.setValidationQuery(validationQuery); - 
ds.setTimeBetweenEvictionRunsMillis(timeBetweenEvictionRunsMillis); - ds.setMinEvictableIdleTimeMillis(minEvictableIdleTimeMillis); - if (maxOpenPreparedStatements > 0) { - ds.setPoolPreparedStatements(true); - ds.setMaxPoolPreparedStatementPerConnectionSize( - maxOpenPreparedStatements); - } else { - ds.setPoolPreparedStatements(false); - } - log.info("Druid data source initialed!"); - return ds; - } -} diff --git a/datachecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/entity/DataChecker.java b/datachecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/entity/DataChecker.java deleted file mode 100644 index b108cc6b9..000000000 --- a/datachecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/entity/DataChecker.java +++ /dev/null @@ -1,90 +0,0 @@ -/* - * Copyright 2019 WeBank - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- * - */ - -package com.webank.wedatasphere.dss.appjoint.schedulis.jobtype.entity; - -import com.webank.wedatasphere.dss.appjoint.execution.common.NodeExecutionState; -import com.webank.wedatasphere.dss.appjoint.schedulis.jobtype.connector.DataCheckerDao; -import com.webank.wedatasphere.dss.appjoint.schedulis.jobtype.execution.DataCheckerNodeExecutionAction; -import org.apache.log4j.Logger; - -import java.util.Properties; - -public class DataChecker { - public final static String SOURCE_TYPE = "source.type"; - public final static String DATA_OBJECT = "check.object"; - public final static String WAIT_TIME = "max.check.hours"; - public final static String QUERY_FREQUENCY = "query.frequency"; - public final static String TIME_SCAPE = "time.scape"; - - private Properties p; - private static final Logger logger = Logger.getRootLogger(); - DataCheckerDao wbDao = DataCheckerDao.getInstance(); - DataCheckerNodeExecutionAction dataCheckerAction = null; - public long maxWaitTime; - public int queryFrequency; - - public DataChecker(String jobName, Properties p,DataCheckerNodeExecutionAction action) { - this.p = p; - dataCheckerAction = action; - maxWaitTime = Long.valueOf(p.getProperty(DataChecker.WAIT_TIME, "1")) * 3600 * 1000; - queryFrequency = Integer.valueOf(p.getProperty(DataChecker.QUERY_FREQUENCY, "30000")); - - } - - public void run() { - dataCheckerAction.setState(NodeExecutionState.Running); - try { - if(p == null) { - throw new RuntimeException("Properties is null. 
Can't continue"); - } - if (!p.containsKey(SOURCE_TYPE)) { - logger.info("Properties " + SOURCE_TYPE + " value is Null !"); - } - if (!p.containsKey(DATA_OBJECT)) { - logger.info("Properties " + DATA_OBJECT + " value is Null !"); - } - begineCheck(); - }catch (Exception ex){ - dataCheckerAction.setState(NodeExecutionState.Failed); - throw new RuntimeException("get DataChecker result failed", ex); - } - - } - - public void begineCheck(){ - boolean success=false; - try { - success= wbDao.validateTableStatusFunction(p, logger); - }catch (Exception ex){ - dataCheckerAction.setState(NodeExecutionState.Failed); - logger.error("datacheck error",ex); - throw new RuntimeException("get DataChecker result failed", ex); - } - if(success) { - dataCheckerAction.setState(NodeExecutionState.Success); - }else { - dataCheckerAction.setState(NodeExecutionState.Running); - } - } - - public void cancel() { -// DataCheckerDao.closeDruidDataSource(); -// throw new RuntimeException("Kill this DataChecker job."); - } - -} \ No newline at end of file diff --git a/datachecker-appjoint/src/main/resources/appjoint.properties b/datachecker-appjoint/src/main/resources/appjoint.properties deleted file mode 100644 index 01e9ff0b4..000000000 --- a/datachecker-appjoint/src/main/resources/appjoint.properties +++ /dev/null @@ -1,24 +0,0 @@ -# -# Copyright 2019 WeBank -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. 
-#
-#
-
-job.datachecker.jdo.option.name=job
-job.datachecker.jdo.option.url=jdbc:mysql://127.0.0.1:3306/
-job.datachecker.jdo.option.username=
-job.datachecker.jdo.option.password=
-
-
-
diff --git a/datachecker-appjoint/src/main/resources/log4j.properties b/datachecker-appjoint/src/main/resources/log4j.properties
deleted file mode 100644
index 0807e6087..000000000
--- a/datachecker-appjoint/src/main/resources/log4j.properties
+++ /dev/null
@@ -1,37 +0,0 @@
-#
-# Copyright 2019 WeBank
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-#
-
-### set log levels ###
-
-log4j.rootCategory=INFO,console
-
-log4j.appender.console=org.apache.log4j.ConsoleAppender
-log4j.appender.console.Threshold=INFO
-log4j.appender.console.layout=org.apache.log4j.PatternLayout
-#log4j.appender.console.layout.ConversionPattern= %d{ISO8601} %-5p (%t) [%F:%M(%L)] - %m%n
-log4j.appender.console.layout.ConversionPattern= %d{ISO8601} %-5p (%t) %p %c{1} - %m%n
-
-
-log4j.appender.com.webank.bdp.ide.core=org.apache.log4j.DailyRollingFileAppender
-log4j.appender.com.webank.bdp.ide.core.Threshold=INFO
-log4j.additivity.com.webank.bdp.ide.core=false
-log4j.appender.com.webank.bdp.ide.core.layout=org.apache.log4j.PatternLayout
-log4j.appender.com.webank.bdp.ide.core.Append=true
-log4j.appender.com.webank.bdp.ide.core.File=logs/linkis.log
-log4j.appender.com.webank.bdp.ide.core.layout.ConversionPattern= %d{ISO8601} %-5p (%t) [%F:%M(%L)] - %m%n
-
-log4j.logger.org.springframework=INFO
diff --git a/datachecker-appjoint/src/main/resources/log4j2.xml b/datachecker-appjoint/src/main/resources/log4j2.xml
deleted file mode 100644
index 3923cd9f3..000000000
--- a/datachecker-appjoint/src/main/resources/log4j2.xml
+++ /dev/null
@@ -1,39 +0,0 @@
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
diff --git a/datachecker-appjoint/src/main/scala/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/execution/DataCheckerAppJoint.scala b/datachecker-appjoint/src/main/scala/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/execution/DataCheckerAppJoint.scala
deleted file mode 100644
index f71b8307d..000000000
--- a/datachecker-appjoint/src/main/scala/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/execution/DataCheckerAppJoint.scala
+++ /dev/null
@@ -1,51 +0,0 @@
-/*
- * Copyright 2019 WeBank
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - * - */ - -package com.webank.wedatasphere.dss.appjoint.schedulis.jobtype.execution - -import java.util - -import com.webank.wedatasphere.dss.appjoint.AppJoint -import com.webank.wedatasphere.dss.appjoint.execution.NodeExecution -import com.webank.wedatasphere.dss.appjoint.service.AppJointUrlImpl - -/** - * Created by enjoyyin on 2019/11/5. - */ -class DataCheckerAppJoint extends AppJointUrlImpl with AppJoint { - - private var params: util.Map[String, AnyRef] = _ - private var nodeExecution: NodeExecution = _ - - override def getAppJointName: String = "DataChecker" - - override def init(baseUrl: String, params: util.Map[String, AnyRef]): Unit = { - setBaseUrl(baseUrl) - this.params = params - } - - override def getNodeExecution: NodeExecution = { - if(nodeExecution == null) synchronized { - if(nodeExecution == null) { - nodeExecution = new DataCheckerExecution() - nodeExecution.setBaseUrl(getBaseUrl) - nodeExecution.init(params) - } - } - nodeExecution - } -} diff --git a/datachecker-appjoint/src/main/scala/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/execution/DataCheckerExecution.scala b/datachecker-appjoint/src/main/scala/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/execution/DataCheckerExecution.scala deleted file mode 100644 index c169d4431..000000000 --- a/datachecker-appjoint/src/main/scala/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/execution/DataCheckerExecution.scala +++ /dev/null @@ -1,157 +0,0 @@ -/* - * Copyright 2019 WeBank - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you 
may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - * - */ - -package com.webank.wedatasphere.dss.appjoint.schedulis.jobtype.execution - -import java.util -import java.util.{Properties, UUID} - -import com.webank.wedatasphere.dss.appjoint.execution.common._ -import com.webank.wedatasphere.dss.appjoint.execution.core._ -import com.webank.wedatasphere.dss.appjoint.schedulis.jobtype.entity.DataChecker -import com.webank.wedatasphere.dss.appjoint.service.AppJointUrl -import com.webank.wedatasphere.dss.appjoint.service.session.Session -import com.webank.wedatasphere.linkis.common.utils.Utils -import org.slf4j.LoggerFactory; - -/** - * Created by allenlliu on 2019/11/11. 
- */ -class DataCheckerExecution extends LongTermNodeExecution with AppJointUrl with Killable with Procedure { - private val logger = LoggerFactory.getLogger(classOf[DataCheckerExecution]) - import scala.collection.JavaConversions.mapAsScalaMap - var appJointParams : scala.collection.mutable.Map[String,AnyRef]= null - - - /** - * 表示任务能否提交到该AppJoint去执行 - * - * @param node AppJointNode - * @return true is ok while false is not - */ - override def canExecute(node: AppJointNode, context: NodeContext, session: Session): Boolean = node.getNodeType.toLowerCase.contains("datachecker") - - protected def putErrorMsg(errorMsg: String, t: Throwable, action: DataCheckerNodeExecutionAction): DataCheckerNodeExecutionAction = t match { - - case t: Exception => - val response = action.response - response.setErrorMsg(errorMsg) - response.setException(t) - response.setIsSucceed(false) - action - } - - override def init(params: util.Map[String, AnyRef]): Unit = { - this.appJointParams = params - } - override def submit(node: AppJointNode, context: NodeContext, session:Session): NodeExecutionAction = { - val nodeAction = new DataCheckerNodeExecutionAction() - nodeAction.setId(UUID.randomUUID().toString()) - val jobName = node.getName - val scalaParams: scala.collection.mutable.Map[String,Object] =context.getRuntimeMap - val properties = new Properties() - this.appJointParams.foreach{ - case (key: String, value: Object) => - logger.info("appjoint params key : "+key+",value : "+value) - properties.put(key, value.toString) - } - scalaParams.foreach { case (key: String, value: Object) => - logger.info("request params key : "+key+",value : "+value) - properties.put(key, value.toString) - } - val dc = new DataChecker(jobName,properties,nodeAction) - dc.run() - nodeAction.setDc(dc) - nodeAction - } - - override def state(action: NodeExecutionAction): NodeExecutionState = { - action match { - case action: DataCheckerNodeExecutionAction => { - if (action.state.isCompleted) return action.state - 
action.dc.begineCheck() - action.state - } - case _ => NodeExecutionState.Failed - } - } - private var baseUrl:String ="" - - override def getBaseUrl: String = baseUrl - - override def setBaseUrl(basicUrl: String): Unit = { - this.baseUrl = basicUrl - } - - override def result(action: NodeExecutionAction, nodeContext: NodeContext): CompletedNodeExecutionResponse = { - val response = new CompletedNodeExecutionResponse - action match { - case action: DataCheckerNodeExecutionAction => { - if (action.state.equals(NodeExecutionState.Success)) { - response.setIsSucceed(true) - } else { - response.setIsSucceed(false) - } - response - } - case _ => { - response.setIsSucceed(false) - response - } - - - } - } - - override def kill(action: NodeExecutionAction): Boolean = action match { - case longTermAction: DataCheckerNodeExecutionAction => - getScheduler.removeAsyncResponse(longTermAction) - true - } - - override def progress(action: NodeExecutionAction): Float = { - return 0.5f - } - - override def log(action: NodeExecutionAction): String = { - action match { - case action: DataCheckerNodeExecutionAction => { - if (!action.state.isCompleted) { - "DataChecker is waiting for tables" - } else { - "DataChecker successfully received info of tables" - } - } - case _ => "Error for NodeExecutionAction " - } - - } - - override def createAsyncNodeExecutionResponse(node: AppJointNode, context: NodeContext, action: NodeExecutionAction): AsyncNodeExecutionResponse = { - action match { - case action: DataCheckerNodeExecutionAction => { - val response = new AsyncNodeExecutionResponse - response.setAction(action) - response.setAppJointNode(node) - response.setNodeContext(context) - response.setMaxLoopTime(action.dc.maxWaitTime) - response.setAskStatePeriod(action.dc.queryFrequency) - response - } - } - } -} diff --git a/datachecker-appjoint/src/main/scala/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/execution/DataCheckerNodeExecutionAction.scala 
b/datachecker-appjoint/src/main/scala/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/execution/DataCheckerNodeExecutionAction.scala deleted file mode 100644 index fff35e067..000000000 --- a/datachecker-appjoint/src/main/scala/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/execution/DataCheckerNodeExecutionAction.scala +++ /dev/null @@ -1,46 +0,0 @@ -/* - * Copyright 2019 WeBank - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - * - */ - -package com.webank.wedatasphere.dss.appjoint.schedulis.jobtype.execution - -import com.webank.wedatasphere.dss.appjoint.execution.common.{AbstractNodeExecutionAction, CompletedNodeExecutionResponse, LongTermNodeExecutionAction, NodeExecutionState} -import com.webank.wedatasphere.dss.appjoint.schedulis.jobtype.entity.DataChecker - -/** - * Created by allenlliu on 2019/11/12. 
- */ -class DataCheckerNodeExecutionAction extends AbstractNodeExecutionAction with LongTermNodeExecutionAction { - private[this] var _state: NodeExecutionState = null - private var schedulerId: Int = _ - def state: NodeExecutionState = _state - - def setState(value: NodeExecutionState): Unit = { - _state = value - } - val response = new CompletedNodeExecutionResponse - private[this] var _dc: DataChecker = null - - def dc: DataChecker = _dc - - def setDc(value: DataChecker): Unit = { - _dc = value - } - - override def setSchedulerId(schedulerId: Int): Unit = this.schedulerId = schedulerId - - override def getSchedulerId: Int = schedulerId -} diff --git a/db/azkaban.sql b/db/azkaban.sql deleted file mode 100644 index 7f18b3308..000000000 --- a/db/azkaban.sql +++ /dev/null @@ -1,4 +0,0 @@ -INSERT INTO `dss_application` (`id`, `name`, `url`, `is_user_need_init`, `level`, `user_init_url`, `exists_project_service`, `project_url`, `enhance_json`, `if_iframe`, `homepage_url`, `redirect_url`) VALUES (NULL, 'schedulis', NULL, '0', '1', NULL, '0', NULL, NULL, '1', NULL, NULL); -UPDATE `dss_application` SET url = 'http://AZKABAN_ADRESS_IP_2:AZKABAN_ADRESS_PORT', project_url = 'http://AZKABAN_ADRESS_IP_2:AZKABAN_ADRESS_PORT/manager?project=${projectName}',homepage_url = 'http://AZKABAN_ADRESS_IP_2:AZKABAN_ADRESS_PORT/homepage' WHERE `name` in ('schedulis'); -SELECT @shcedulis_id:=id FROM `dss_application` WHERE `name` = 'schedulis'; -insert into dss_workflow_node values(null,null,'linkis.shell.sh',@shcedulis_id,1,1,0,1,null); diff --git a/db/davinci.sql b/db/davinci.sql deleted file mode 100644 index 29ed90a7a..000000000 --- a/db/davinci.sql +++ /dev/null @@ -1,666 +0,0 @@ -SET NAMES utf8mb4; -SET FOREIGN_KEY_CHECKS = 0; - --- ---------------------------- --- Table structure for cron_job --- ---------------------------- -DROP TABLE IF EXISTS `cron_job`; -CREATE TABLE `cron_job` ( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `name` varchar(45) COLLATE utf8_unicode_ci NOT NULL, 
- `project_id` bigint(20) NOT NULL, - `job_type` varchar(45) COLLATE utf8_unicode_ci NOT NULL, - `job_status` varchar(10) COLLATE utf8_unicode_ci NOT NULL DEFAULT '', - `cron_expression` varchar(45) COLLATE utf8_unicode_ci NOT NULL, - `start_date` datetime NOT NULL, - `end_date` datetime NOT NULL, - `config` text COLLATE utf8_unicode_ci NOT NULL, - `description` varchar(255) COLLATE utf8_unicode_ci DEFAULT NULL, - `exec_log` text COLLATE utf8_unicode_ci, - `create_by` bigint(20) NOT NULL, - `create_time` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP, - `update_time` timestamp NULL DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE, - UNIQUE KEY `name_UNIQUE` (`name`) USING BTREE -) ENGINE=InnoDB AUTO_INCREMENT=13 DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci; - --- ---------------------------- --- Table structure for dashboard --- ---------------------------- -DROP TABLE IF EXISTS `dashboard`; -CREATE TABLE `dashboard` -( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `name` varchar(255) NOT NULL, - `dashboard_portal_id` bigint(20) NOT NULL, - `type` smallint(1) NOT NULL, - `index` int(4) NOT NULL, - `parent_id` bigint(20) NOT NULL DEFAULT '0', - `config` text, - `full_parent_Id` varchar(100) DEFAULT NULL, - `create_by` bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE, - KEY `idx_dashboard_id` (`dashboard_portal_id`) USING BTREE, - KEY `idx_parent_id` (`parent_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8; - --- ---------------------------- --- Table structure for dashboard_portal --- ---------------------------- -DROP TABLE IF EXISTS `dashboard_portal`; -CREATE TABLE `dashboard_portal` -( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `name` varchar(255) NOT NULL, - `description` varchar(255) DEFAULT NULL, - `project_id` bigint(20) NOT NULL, - `avatar` varchar(255) DEFAULT NULL, - `publish` tinyint(1) NOT 
NULL DEFAULT '0', - `create_by` bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE, - KEY `idx_project_id` (`project_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8; - --- ---------------------------- --- Table structure for display --- ---------------------------- -DROP TABLE IF EXISTS `display`; -CREATE TABLE `display` -( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `name` varchar(255) NOT NULL, - `description` varchar(255) DEFAULT NULL, - `project_id` bigint(20) NOT NULL, - `avatar` varchar(255) DEFAULT NULL, - `publish` tinyint(1) NOT NULL, - `create_by` bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE, - KEY `idx_project_id` (`project_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8; - --- ---------------------------- --- Table structure for display_slide --- ---------------------------- -DROP TABLE IF EXISTS `display_slide`; -CREATE TABLE `display_slide` -( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `display_id` bigint(20) NOT NULL, - `index` int(12) NOT NULL, - `config` text NOT NULL, - `create_by` bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE, - KEY `idx_display_id` (`display_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8; - --- ---------------------------- --- Table structure for download_record --- ---------------------------- -DROP TABLE IF EXISTS `download_record`; -CREATE TABLE `download_record` -( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `name` varchar(255) NOT NULL, - `user_id` bigint(20) NOT NULL, - `path` varchar(255) DEFAULT NULL, - `status` smallint(1) NOT NULL, - `create_time` datetime NOT NULL, - `last_download_time` datetime 
DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE, - KEY `idx_user` (`user_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8mb4; - --- ---------------------------- --- Table structure for favorite --- ---------------------------- -DROP TABLE IF EXISTS `favorite`; -CREATE TABLE `favorite` -( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `user_id` bigint(20) NOT NULL, - `project_id` bigint(20) NOT NULL, - `create_time` datetime NOT NULL ON UPDATE CURRENT_TIMESTAMP, - PRIMARY KEY (`id`) USING BTREE, - UNIQUE KEY `idx_user_project` (`user_id`, `project_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8; - --- ---------------------------- --- Table structure for mem_dashboard_widget --- ---------------------------- -DROP TABLE IF EXISTS `mem_dashboard_widget`; -CREATE TABLE `mem_dashboard_widget` -( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `dashboard_id` bigint(20) NOT NULL, - `widget_Id` bigint(20) DEFAULT NULL, - `x` int(12) NOT NULL, - `y` int(12) NOT NULL, - `width` int(12) NOT NULL, - `height` int(12) NOT NULL, - `polling` tinyint(1) NOT NULL DEFAULT '0', - `frequency` int(12) DEFAULT NULL, - `config` text, - `create_by` bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE, - KEY `idx_protal_id` (`dashboard_id`) USING BTREE, - KEY `idx_widget_id` (`widget_Id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8; - --- ---------------------------- --- Table structure for mem_display_slide_widget --- ---------------------------- -DROP TABLE IF EXISTS `mem_display_slide_widget`; -CREATE TABLE `mem_display_slide_widget` -( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `display_slide_id` bigint(20) NOT NULL, - `widget_id` bigint(20) DEFAULT NULL, - `name` varchar(255) NOT NULL, - `params` text NOT NULL, - `type` smallint(1) NOT NULL, - `sub_type` smallint(2) DEFAULT NULL, - `index` int(12) NOT NULL DEFAULT '0', - `create_by` 
bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE, - KEY `idx_slide_id` (`display_slide_id`) USING BTREE, - KEY `idx_widget_id` (`widget_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8; - --- ---------------------------- --- Table structure for organization --- ---------------------------- -DROP TABLE IF EXISTS `organization`; -CREATE TABLE `organization` -( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `name` varchar(255) NOT NULL, - `description` varchar(255) DEFAULT NULL, - `avatar` varchar(255) DEFAULT NULL, - `user_id` bigint(20) NOT NULL, - `project_num` int(20) DEFAULT '0', - `member_num` int(20) DEFAULT '0', - `role_num` int(20) DEFAULT '0', - `allow_create_project` tinyint(1) DEFAULT '1', - `member_permission` smallint(1) NOT NULL DEFAULT '0', - `create_time` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP, - `create_by` bigint(20) NOT NULL DEFAULT '0', - `update_time` timestamp NULL DEFAULT NULL, - `update_by` bigint(20) DEFAULT '0', - PRIMARY KEY (`id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8; - --- ---------------------------- --- Table structure for platform --- ---------------------------- -DROP TABLE IF EXISTS `platform`; -CREATE TABLE `platform` -( - `id` bigint(20) NOT NULL, - `name` varchar(255) NOT NULL, - `platform` varchar(255) NOT NULL, - `code` varchar(32) NOT NULL, - `checkCode` varchar(255) DEFAULT NULL, - `checkSystemToken` varchar(255) DEFAULT NULL, - `checkUrl` varchar(255) DEFAULT NULL, - `alternateField1` varchar(255) DEFAULT NULL, - `alternateField2` varchar(255) DEFAULT NULL, - `alternateField3` varchar(255) DEFAULT NULL, - `alternateField4` varchar(255) DEFAULT NULL, - `alternateField5` varchar(255) DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8; - --- ---------------------------- --- Table structure for project 
--- ---------------------------- -DROP TABLE IF EXISTS `project`; -CREATE TABLE `project` -( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `name` varchar(255) NOT NULL, - `description` varchar(255) DEFAULT NULL, - `pic` varchar(255) DEFAULT NULL, - `org_id` bigint(20) NOT NULL, - `user_id` bigint(20) NOT NULL, - `visibility` tinyint(1) DEFAULT '1', - `star_num` int(11) DEFAULT '0', - `is_transfer` tinyint(1) NOT NULL DEFAULT '0', - `initial_org_id` bigint(20) NOT NULL, - `create_by` bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8; - --- ---------------------------- --- Table structure for rel_project_admin --- ---------------------------- -DROP TABLE IF EXISTS `rel_project_admin`; -CREATE TABLE `rel_project_admin` -( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `project_id` bigint(20) NOT NULL, - `user_id` bigint(20) NOT NULL, - `create_by` bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE, - UNIQUE KEY `idx_project_user` (`project_id`, `user_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8mb4 COMMENT ='project admin表'; - --- ---------------------------- --- Table structure for rel_role_dashboard --- ---------------------------- -DROP TABLE IF EXISTS `rel_role_dashboard`; -CREATE TABLE `rel_role_dashboard` -( - `role_id` bigint(20) NOT NULL, - `dashboard_id` bigint(20) NOT NULL, - `visible` tinyint(1) NOT NULL DEFAULT '0', - `create_by` bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - PRIMARY KEY (`role_id`, `dashboard_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8mb4; - --- ---------------------------- --- Table structure for rel_role_display --- 
---------------------------- -DROP TABLE IF EXISTS `rel_role_display`; -CREATE TABLE `rel_role_display` -( - `role_id` bigint(20) NOT NULL, - `display_id` bigint(20) NOT NULL, - `visible` tinyint(1) NOT NULL DEFAULT '0', - `create_by` bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - PRIMARY KEY (`role_id`, `display_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8mb4; - --- ---------------------------- --- Table structure for rel_role_portal --- ---------------------------- -DROP TABLE IF EXISTS `rel_role_portal`; -CREATE TABLE `rel_role_portal` -( - `role_id` bigint(20) NOT NULL, - `portal_id` bigint(20) NOT NULL, - `visible` tinyint(1) NOT NULL DEFAULT '0', - `create_by` bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - PRIMARY KEY (`role_id`, `portal_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8mb4; - --- ---------------------------- --- Table structure for rel_role_project --- ---------------------------- -DROP TABLE IF EXISTS `rel_role_project`; -CREATE TABLE `rel_role_project` -( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `project_id` bigint(20) NOT NULL, - `role_id` bigint(20) NOT NULL, - `source_permission` smallint(1) NOT NULL DEFAULT '1', - `view_permission` smallint(1) NOT NULL DEFAULT '1', - `widget_permission` smallint(1) NOT NULL DEFAULT '1', - `viz_permission` smallint(1) NOT NULL DEFAULT '1', - `schedule_permission` smallint(1) NOT NULL DEFAULT '1', - `share_permission` tinyint(1) NOT NULL DEFAULT '0', - `download_permission` tinyint(1) NOT NULL DEFAULT '0', - `create_by` bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE, - UNIQUE KEY `idx_role_project` (`project_id`, `role_id`) USING BTREE -) ENGINE = InnoDB - 
DEFAULT CHARSET = utf8mb4; - --- ---------------------------- --- Table structure for rel_role_slide --- ---------------------------- -DROP TABLE IF EXISTS `rel_role_slide`; -CREATE TABLE `rel_role_slide` -( - `role_id` bigint(20) NOT NULL, - `slide_id` bigint(20) NOT NULL, - `visible` tinyint(1) NOT NULL DEFAULT '0', - `create_by` bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - PRIMARY KEY (`role_id`, `slide_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8mb4; - --- ---------------------------- --- Table structure for rel_role_user --- ---------------------------- -DROP TABLE IF EXISTS `rel_role_user`; -CREATE TABLE `rel_role_user` -( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `user_id` bigint(20) NOT NULL, - `role_id` bigint(20) NOT NULL, - `create_by` bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE, - UNIQUE KEY `idx_role_user` (`user_id`, `role_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8mb4; - --- ---------------------------- --- Table structure for rel_role_view --- ---------------------------- -DROP TABLE IF EXISTS `rel_role_view`; -CREATE TABLE `rel_role_view` -( - `view_id` bigint(20) NOT NULL, - `role_id` bigint(20) NOT NULL, - `row_auth` text, - `column_auth` text, - `create_by` bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - PRIMARY KEY (`view_id`, `role_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8; - --- ---------------------------- --- Table structure for rel_user_organization --- ---------------------------- -DROP TABLE IF EXISTS `rel_user_organization`; -CREATE TABLE `rel_user_organization` -( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `org_id` bigint(20) NOT NULL, - `user_id` bigint(20) 
NOT NULL, - `role` smallint(1) NOT NULL DEFAULT '0', - PRIMARY KEY (`id`) USING BTREE, - UNIQUE KEY `idx_org_user` (`org_id`, `user_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8; - --- ---------------------------- --- Table structure for role --- ---------------------------- -DROP TABLE IF EXISTS `role`; -CREATE TABLE `role` -( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `org_id` bigint(20) NOT NULL, - `name` varchar(100) NOT NULL, - `description` varchar(255) DEFAULT NULL, - `create_by` bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - `avatar` varchar(255) DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE, - KEY `idx_orgid` (`org_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8mb4 COMMENT ='权限表'; - --- ---------------------------- --- Table structure for source --- ---------------------------- -DROP TABLE IF EXISTS `source`; -CREATE TABLE `source` -( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `name` varchar(255) NOT NULL, - `description` varchar(255) DEFAULT NULL, - `config` text NOT NULL, - `type` varchar(10) NOT NULL, - `project_id` bigint(20) NOT NULL, - `create_by` bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - `parent_id` bigint(20) DEFAULT NULL, - `full_parent_id` varchar(255) DEFAULT NULL, - `is_folder` tinyint(1) DEFAULT NULL, - `index` int(5) DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE, - KEY `idx_project_id` (`project_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8; - --- ---------------------------- --- Table structure for star --- ---------------------------- -DROP TABLE IF EXISTS `star`; -CREATE TABLE `star` -( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `target` varchar(20) NOT NULL, - `target_id` bigint(20) NOT NULL, - `user_id` bigint(20) NOT NULL, - `star_time` datetime NOT NULL ON UPDATE CURRENT_TIMESTAMP, - 
PRIMARY KEY (`id`) USING BTREE, - KEY `idx_target_id` (`target_id`) USING BTREE, - KEY `idx_user_id` (`user_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8; - --- ---------------------------- --- Table structure for user --- ---------------------------- -DROP TABLE IF EXISTS `user`; -CREATE TABLE `user` -( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `email` varchar(255) NOT NULL, - `username` varchar(255) NOT NULL, - `password` varchar(255) NOT NULL, - `admin` tinyint(1) NOT NULL, - `active` tinyint(1) DEFAULT NULL, - `name` varchar(255) DEFAULT NULL, - `description` varchar(255) DEFAULT NULL, - `department` varchar(255) DEFAULT NULL, - `avatar` varchar(255) DEFAULT NULL, - `create_time` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP, - `create_by` bigint(20) NOT NULL DEFAULT '0', - `update_time` timestamp NOT NULL DEFAULT '1970-01-01 08:00:01', - `update_by` bigint(20) NOT NULL DEFAULT '0', - PRIMARY KEY (`id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8; - --- ---------------------------- --- Table structure for view --- ---------------------------- -DROP TABLE IF EXISTS `view`; -CREATE TABLE `view` -( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `name` varchar(255) NOT NULL, - `description` varchar(255) DEFAULT NULL, - `project_id` bigint(20) NOT NULL, - `source_id` bigint(20) NOT NULL, - `sql` text, - `model` text, - `variable` text, - `config` text, - `create_by` bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - `parent_id` bigint(20) DEFAULT NULL, - `full_parent_id` varchar(255) DEFAULT NULL, - `is_folder` tinyint(1) DEFAULT NULL, - `index` int(5) DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE, - KEY `idx_project_id` (`project_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8; - --- ---------------------------- --- Table structure for widget --- ---------------------------- -DROP TABLE IF EXISTS `widget`; 
-CREATE TABLE `widget` -( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `name` varchar(255) NOT NULL, - `description` varchar(255) DEFAULT NULL, - `view_id` bigint(20) NOT NULL, - `project_id` bigint(20) NOT NULL, - `type` bigint(20) NOT NULL, - `publish` tinyint(1) NOT NULL, - `config` longtext NOT NULL, - `create_by` bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - `parent_id` bigint(20) DEFAULT NULL, - `full_parent_id` varchar(255) DEFAULT NULL, - `is_folder` tinyint(1) DEFAULT NULL, - `index` int(5) DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE, - KEY `idx_project_id` (`project_id`) USING BTREE, - KEY `idx_view_id` (`view_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8; - - -DROP TABLE IF EXISTS `rel_role_display_slide_widget`; -CREATE TABLE `rel_role_display_slide_widget` -( - `role_id` bigint(20) NOT NULL, - `mem_display_slide_widget_id` bigint(20) NOT NULL, - `visible` tinyint(1) NOT NULL DEFAULT '0', - `create_by` bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - PRIMARY KEY (`role_id`, `mem_display_slide_widget_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8mb4; - - -DROP TABLE IF EXISTS `rel_role_dashboard_widget`; -CREATE TABLE `rel_role_dashboard_widget` -( - `role_id` bigint(20) NOT NULL, - `mem_dashboard_widget_id` bigint(20) NOT NULL, - `visible` tinyint(1) NOT NULL DEFAULT '0', - `create_by` bigint(20) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - PRIMARY KEY (`role_id`, `mem_dashboard_widget_id`) USING BTREE -) ENGINE = InnoDB - DEFAULT CHARSET = utf8mb4; - -DROP TABLE IF EXISTS `davinci_statistic_visitor_operation`; -CREATE TABLE `davinci_statistic_visitor_operation` ( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `user_id` bigint(20) DEFAULT NULL, - 
`email` varchar(255) DEFAULT NULL, - `action` varchar(255) DEFAULT NULL COMMENT 'login/visit/initial/sync/search/linkage/drill/download/print', - `org_id` bigint(20) DEFAULT NULL, - `project_id` bigint(20) DEFAULT NULL, - `project_name` varchar(255) DEFAULT NULL, - `viz_type` varchar(255) DEFAULT NULL COMMENT 'dashboard/display', - `viz_id` bigint(20) DEFAULT NULL, - `viz_name` varchar(255) DEFAULT NULL, - `sub_viz_id` bigint(20) DEFAULT NULL, - `sub_viz_name` varchar(255) DEFAULT NULL, - `widget_id` bigint(20) DEFAULT NULL, - `widget_name` varchar(255) DEFAULT NULL, - `variables` varchar(500) DEFAULT NULL, - `filters` varchar(500) DEFAULT NULL, - `groups` varchar(500) DEFAULT NULL, - `create_time` timestamp NULL DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE -) ENGINE=InnoDB DEFAULT CHARSET=utf8; - -DROP TABLE IF EXISTS `davinci_statistic_terminal`; -CREATE TABLE `davinci_statistic_terminal` ( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `user_id` bigint(20) DEFAULT NULL, - `email` varchar(255) DEFAULT NULL, - `browser_name` varchar(255) DEFAULT NULL, - `browser_version` varchar(255) DEFAULT NULL, - `engine_name` varchar(255) DEFAULT NULL, - `engine_version` varchar(255) DEFAULT NULL, - `os_name` varchar(255) DEFAULT NULL, - `os_version` varchar(255) DEFAULT NULL, - `device_model` varchar(255) DEFAULT NULL, - `device_type` varchar(255) DEFAULT NULL, - `device_vendor` varchar(255) DEFAULT NULL, - `cpu_architecture` varchar(255) DEFAULT NULL, - `create_time` timestamp NULL DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE -) ENGINE=InnoDB DEFAULT CHARSET=utf8; - - -DROP TABLE IF EXISTS `davinci_statistic_duration`; -CREATE TABLE `davinci_statistic_duration` ( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `user_id` bigint(20) DEFAULT NULL, - `email` varchar(255) DEFAULT NULL, - `start_time` timestamp NULL DEFAULT NULL, - `end_time` timestamp NULL DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE -) ENGINE=InnoDB DEFAULT CHARSET=utf8; - -DROP TABLE IF EXISTS 
`share_download_record`; -CREATE TABLE `share_download_record` ( - `id` bigint(20) NOT NULL AUTO_INCREMENT, - `uuid` varchar(50) DEFAULT NULL, - `name` varchar(255) NOT NULL, - `path` varchar(255) DEFAULT NULL, - `status` smallint(1) NOT NULL, - `create_time` datetime NOT NULL, - `last_download_time` datetime DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE -) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4; - - -SET FOREIGN_KEY_CHECKS = 1; -INSERT INTO `source` (id,name,description,config,type,project_id,create_by,create_time,update_by,update_time,parent_id,full_parent_id,is_folder,`index`) VALUES (1,'hiveDataSource','','{"parameters":"","password":"","url":"test","username":"hiveDataSource-token"}','hive',-1,null,null,null,null,null,null,null,null); \ No newline at end of file diff --git a/db/dss_ddl.sql b/db/dss_ddl.sql index cdaf8fb1a..afa01040b 100644 --- a/db/dss_ddl.sql +++ b/db/dss_ddl.sql @@ -1,8 +1,150 @@ -SET FOREIGN_KEY_CHECKS=0; --- ---------------------------- --- Table structure for dss_application --- ---------------------------- +DROP TABLE IF EXISTS `dss_apiservice_api`; +CREATE TABLE `dss_apiservice_api` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT COMMENT 'Primary key', + `name` varchar(180) NOT NULL COMMENT 'Service name', + `alias_name` varchar(200) NOT NULL COMMENT 'Service name in Chinese', + `path` varchar(180) NOT NULL COMMENT 'Service path', + `protocol` int(11) NOT NULL COMMENT 'Protocol: http, https', + `method` varchar(10) NOT NULL COMMENT 'Method: post, put, delete', + `tag` varchar(200) DEFAULT NULL COMMENT 'Tag', + `scope` varchar(50) DEFAULT NULL COMMENT 'Scope', + `description` varchar(200) DEFAULT NULL COMMENT 'Service description', + `status` int(11) DEFAULT '0' COMMENT 'Service status: 0 = stopped (default), 1 = running, 2 = deleted', + `type` varchar(50) DEFAULT NULL COMMENT 'Service engine type', + `run_type` varchar(50) DEFAULT NULL COMMENT 'Script type', + `create_time` timestamp DEFAULT CURRENT_TIMESTAMP COMMENT 'Creation time', + `modify_time` timestamp DEFAULT CURRENT_TIMESTAMP COMMENT 'Modification time', + `creator` varchar(50) DEFAULT NULL COMMENT 'Creator', + `modifier` varchar(50)
DEFAULT NULL COMMENT 'Modifier', + `script_path` varchar(180) NOT NULL COMMENT 'Script path', + `workspaceID` int(11) NOT NULL COMMENT 'Workspace ID', + `api_comment` varchar(1024) DEFAULT NULL COMMENT 'Service remarks', + PRIMARY KEY (`id`), + UNIQUE KEY `idx_uniq_config_name` (`name`), + UNIQUE KEY `idx_uniq_dconfig_path` (`path`), + KEY `idx_dss_script_path` (`script_path`) +) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8mb4 COMMENT='Service API configuration table'; + +DROP TABLE IF EXISTS `dss_apiservice_param`; +CREATE TABLE `dss_apiservice_param` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT COMMENT 'Primary key', + `api_version_id` bigint(20) NOT NULL COMMENT 'Service API version id', + `name` varchar(200) NOT NULL COMMENT 'Name', + `display_name` varchar(50) DEFAULT NULL COMMENT 'Display name', + `type` varchar(50) DEFAULT NULL COMMENT 'Type', + `required` tinyint(1) DEFAULT '1' COMMENT 'Whether required: 0 = no, 1 = yes', + `default_value` varchar(1024) DEFAULT NULL COMMENT 'Default value of the parameter', + `description` varchar(200) DEFAULT NULL COMMENT 'Description', + `details` varchar(1024) DEFAULT NULL COMMENT 'Detailed description of the variable', + PRIMARY KEY (`id`), + KEY `idx_api_version_id` (`api_version_id`) +) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8mb4 COMMENT='apiservice parameter table'; + +DROP TABLE IF EXISTS `dss_apiservice_api_version`; +CREATE TABLE `dss_apiservice_api_version` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT COMMENT 'Primary key', + `api_id` bigint(20) NOT NULL COMMENT 'Service ID', + `version` varchar(50) NOT NULL COMMENT 'Version info of the service', + `bml_resource_id` varchar(50) NOT NULL COMMENT 'bml resource id', + `bml_version` varchar(20) NOT NULL COMMENT 'bml version', + `source` varchar(200) DEFAULT NULL COMMENT 'Source', + `creator` varchar(50) DEFAULT NULL COMMENT 'Creator', + `create_time`timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT 'Creation time', + `status` tinyint(1) default '1' COMMENT '0 = disabled, 1 = running', + `metadata_info` varchar(5000) NOT NULL COMMENT 'Publisher database/table info', + `auth_id` varchar(200) NOT NULL COMMENT 'UUID used to interact with datamap', + `datamap_order_no` varchar(200) DEFAULT NULL COMMENT 'datamap approval form number', + PRIMARY KEY(`id`) +)
ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8mb4 COMMENT='Service API version table'; + +DROP TABLE IF EXISTS `dss_apiservice_token_manager`; +CREATE TABLE `dss_apiservice_token_manager` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT COMMENT 'Primary key', + `api_version_id` bigint(20) NOT NULL COMMENT 'Service API version id', + `api_id` bigint(20) NOT NULL COMMENT 'Service API config id', + `publisher` varchar(20) NOT NULL COMMENT 'Publishing user', + `user` varchar(20) NOT NULL COMMENT 'Applying user', + `apply_time` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT 'Application time', + `duration` int(10) NOT NULL COMMENT 'Duration', + `reason` varchar(200) DEFAULT NULL COMMENT 'Application reason', + `ip_whitelist` varchar(200) DEFAULT NULL COMMENT 'IP whitelist', + `status` tinyint(1) DEFAULT '1' COMMENT 'Status: 0 = expired, 1 = within validity period', + `caller` varchar(50) DEFAULT NULL COMMENT 'Caller', + `access_limit` varchar(50) DEFAULT NULL COMMENT 'Rate limiting info', + `apply_source` varchar(200) DEFAULT NULL COMMENT 'Application source', + `token` varchar(500) DEFAULT NULL COMMENT 'Token content', + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8mb4 COMMENT='apiservice token management table'; + +DROP TABLE IF EXISTS `dss_apiservice_approval`; +CREATE TABLE `dss_apiservice_approval` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT COMMENT 'Primary key', + `api_id` bigint(20) NOT NULL COMMENT 'Service API id', + `api_version_id` bigint(20) NOT NULL COMMENT 'Version id', + `approval_name` varchar(50) NOT NULL COMMENT 'Approval form name', + `apply_user` varchar(1024) NOT NULL COMMENT 'Applying user', + `execute_user` varchar(50) DEFAULT NULL COMMENT 'Proxy execution users, separated by commas', + `creator` varchar(50) NOT NULL COMMENT 'Creator', + `status` int(10) DEFAULT '0' COMMENT 'Application status: 1 = submitted successfully, 2 = under approval, 3 = succeeded, 4 = failed', + `create_time` timestamp NOT null DEFAULT CURRENT_TIMESTAMP COMMENT 'Approval form creation time', + `update_time` timestamp NOT null DEFAULT CURRENT_TIMESTAMP COMMENT 'Approval form status update time', + `approval_no` varchar(500) NOT NULL COMMENT 'Approval form number', + PRIMARY KEY(`id`), + UNIQUE KEY `idx_uniq_api_version_id` (`api_version_id`) +) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8mb4 COMMENT='apiservice approval table'; + +DROP
TABLE IF EXISTS `dss_apiservice_access_info`; +CREATE TABLE `dss_apiservice_access_info` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT COMMENT 'Primary key', + `api_id` bigint(20) NOT NULL COMMENT 'Service id', + `api_version_id` bigint(20) NOT NULL COMMENT 'Version id', + `api_name` varchar(50) NOT NULL COMMENT 'Service name', + `login_user` varchar(50) NOT NULL COMMENT 'Submitting user', + `execute_user` varchar(50) DEFAULT NULL COMMENT 'Proxy execution user', + `api_publisher` varchar(50) NOT NULL COMMENT 'API creator', + `access_time` timestamp NOT null DEFAULT CURRENT_TIMESTAMP COMMENT 'Access time', + PRIMARY KEY(`id`) +) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8mb4 COMMENT='apiservice access info table'; + + +DROP TABLE IF EXISTS `dss_appconn`; +CREATE TABLE `dss_appconn` ( + `id` int(20) NOT NULL AUTO_INCREMENT COMMENT 'Primary key', + `appconn_name` varchar(64) UNIQUE NOT NULL COMMENT 'Name of the AppConn', + `is_user_need_init` tinyint(1) DEFAULT NULL COMMENT 'Whether user initialization is needed', + `level` int(8) DEFAULT NULL COMMENT 'Level', + `if_iframe` tinyint(1) DEFAULT NULL COMMENT 'Whether it can be embedded in an iframe', + `is_external` tinyint(1) DEFAULT NULL COMMENT 'Whether it is an externally integrated application', + `reference` varchar(255) DEFAULT NULL COMMENT 'Identifier of the AppConn to be associated', + `class_name` varchar(255) DEFAULT NULL COMMENT 'Identifier of the AppConn to be associated', + `appconn_class_path` varchar(255) DEFAULT NULL COMMENT 'Identifier of the AppConn to be associated', + `resource` varchar(255) DEFAULT NULL COMMENT 'Resource ID in bml', + PRIMARY KEY (`id`), + UNIQUE KEY `idx_appconn_name` (`appconn_name`) +) ENGINE=InnoDB AUTO_INCREMENT=8 DEFAULT CHARSET=utf8mb4 COMMENT='dss appconn table'; + +DROP TABLE IF EXISTS `dss_appconn_instance`; +CREATE TABLE `dss_appconn_instance` ( + `id` int(20) NOT NULL AUTO_INCREMENT COMMENT 'Primary key', + `appconn_id` int(20) NOT NULL COMMENT 'Primary key of the appconn', + `label` varchar(128) NOT NULL COMMENT 'Label of the instance', + `url` varchar(128) DEFAULT NULL COMMENT 'URL for accessing the third-party system', + `enhance_json` varchar(1024) DEFAULT NULL COMMENT 'Configuration in JSON format', + `homepage_url` varchar(255) DEFAULT NULL COMMENT 'Homepage URL', + `redirect_url` varchar(255) DEFAULT NULL COMMENT 'Redirect URL', + PRIMARY KEY
(`id`) +) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8mb4 COMMENT='dss instance table'; + +DROP TABLE IF EXISTS `dss_appconn_project_relation`; +CREATE TABLE `dss_appconn_project_relation` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `project_id` bigint(20) NOT NULL, + `appconn_instance_id` bigint(20) NOT NULL, + `appconn_instance_project_id` bigint(20) NOT NULL, + PRIMARY KEY (`id`) USING BTREE +) ENGINE=InnoDB AUTO_INCREMENT=90 DEFAULT CHARSET=utf8mb4 ROW_FORMAT=COMPACT; + DROP TABLE IF EXISTS `dss_application`; CREATE TABLE `dss_application` ( `id` int(20) NOT NULL AUTO_INCREMENT, @@ -18,65 +160,186 @@ CREATE TABLE `dss_application` ( `homepage_url` varchar(255) DEFAULT NULL, `redirect_url` varchar(255) DEFAULT NULL, PRIMARY KEY (`id`) -) ENGINE=InnoDB DEFAULT CHARSET=utf8; - +) ENGINE=InnoDB AUTO_INCREMENT=18 DEFAULT CHARSET=utf8; --- ---------------------------- --- Table structure for dss_application_user_init_result --- ---------------------------- DROP TABLE IF EXISTS `dss_application_user_init_result`; CREATE TABLE `dss_application_user_init_result` ( `id` bigint(20) NOT NULL AUTO_INCREMENT, `application_id` int(11) DEFAULT NULL, `result` varchar(255) DEFAULT NULL, - `user_id` bigint(20) DEFAULT NULL, + `username` varchar(32) DEFAULT NULL, `is_init_success` tinyint(1) DEFAULT NULL, PRIMARY KEY (`id`) ) ENGINE=InnoDB DEFAULT CHARSET=utf8; - --- ---------------------------- --- Table structure for dss_workflow_node --- ---------------------------- -DROP TABLE IF EXISTS `dss_workflow_node`; -CREATE TABLE `dss_workflow_node` ( - `id` int(11) NOT NULL AUTO_INCREMENT, - `icon` text, - `node_type` varchar(255) DEFAULT NULL, - `application_id` int(20) DEFAULT NULL, - `submit_to_scheduler` tinyint(1) DEFAULT NULL, - `enable_copy` tinyint(1) DEFAULT NULL, - `should_creation_before_node` tinyint(1) DEFAULT NULL, - `support_jump` tinyint(1) DEFAULT NULL, - `jump_url` varchar(255) DEFAULT NULL, +DROP TABLE IF EXISTS `dss_component_info`; +CREATE TABLE
`dss_component_info` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + `title` varchar(128) NOT NULL, + `icon` varchar(64) NOT NULL, + `desc` varchar(1024) NOT NULL, + `button_text` varchar(64) NOT NULL, + `menu_id` int(10) NOT NULL, + `application_id` int(10) DEFAULT '0', + `user_manual_url` varchar(512) DEFAULT NULL, + `indicator_url` varchar(512) DEFAULT NULL, PRIMARY KEY (`id`) ) ENGINE=InnoDB DEFAULT CHARSET=utf8; --- ---------------------------- --- Table structure for dss_flow --- ---------------------------- -DROP TABLE IF EXISTS `dss_flow`; -CREATE TABLE `dss_flow` ( +DROP TABLE IF EXISTS `dss_component_role`; +CREATE TABLE `dss_component_role` ( `id` bigint(20) NOT NULL AUTO_INCREMENT, - `name` varchar(128) DEFAULT NULL, - `state` tinyint(1) DEFAULT NULL, - `source` varchar(255) DEFAULT NULL, - `description` varchar(255) DEFAULT NULL, + `workspace_id` bigint(20) DEFAULT NULL, + `component_id` int(20) DEFAULT NULL, + `role_id` int(20) DEFAULT NULL, + `priv` int(20) DEFAULT NULL, + `update_time` datetime DEFAULT NULL, + `updateby` varchar(255) DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=5103 DEFAULT CHARSET=utf8; + +DROP TABLE IF EXISTS `dss_datawrangler_export`; +CREATE TABLE `dss_datawrangler_export` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + `spread_sheet_id` int(20) NOT NULL, + `sheet_id` int(20) DEFAULT NULL, + `data_sink` varchar(1000) DEFAULT NULL, + `user_id` varchar(120) DEFAULT '0', `create_time` datetime DEFAULT NULL, - `creator_id` bigint(20) DEFAULT NULL, - `is_root_flow` tinyint(1) DEFAULT NULL, - `rank` int(10) DEFAULT NULL, - `project_id` bigint(20) DEFAULT NULL, - `has_saved` tinyint(1) DEFAULT NULL, - `uses` varchar(255) DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE, - UNIQUE KEY `name` (`name`,`project_id`) USING BTREE -) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 ROW_FORMAT=COMPACT; + `last_update_time` datetime DEFAULT NULL, + `status` varchar(10) DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB DEFAULT CHARSET=utf8; + 
+DROP TABLE IF EXISTS `dss_datawrangler_sheet`; +CREATE TABLE `dss_datawrangler_sheet` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + `name` varchar(120) NOT NULL, + `spread_sheet_id` int(20) NOT NULL, + `order` int(3) DEFAULT NULL, + `data_source` varchar(1000) DEFAULT '0', + `content_location` varchar(500) DEFAULT NULL, + `operation_location` varchar(500) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `last_update_time` datetime DEFAULT NULL, + `config` text, + `is_limited` tinyint(1) DEFAULT '1', + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=4 DEFAULT CHARSET=utf8; + +DROP TABLE IF EXISTS `dss_datawrangler_spreadsheet`; +CREATE TABLE `dss_datawrangler_spreadsheet` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + `name` varchar(120) NOT NULL, + `source` varchar(128) DEFAULT NULL, + `workspace` varchar(120) DEFAULT NULL, + `is_hidden` tinyint(1) DEFAULT '0', + `config` text, + `description` varchar(500) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `last_update_time` datetime DEFAULT NULL, + `last_access_time` datetime DEFAULT NULL, + `access_num` int(10) DEFAULT NULL, + `user_id` varchar(120) DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=4 DEFAULT CHARSET=utf8; + +DROP TABLE IF EXISTS `dss_datawrangler_template`; +CREATE TABLE `dss_datawrangler_template` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + `name` varchar(120) NOT NULL, + `sheet_id` int(20) NOT NULL, + `source` varchar(120) DEFAULT NULL, + `workspace` varchar(120) DEFAULT '0', + `operation_location` varchar(500) DEFAULT NULL, + `description` varchar(500) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `last_update_time` datetime DEFAULT NULL, + `user_id` varchar(120) DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB DEFAULT CHARSET=utf8; + +DROP TABLE IF EXISTS `dss_dev_flow`; +CREATE TABLE `dss_dev_flow` ( + `id` int(11) NOT NULL AUTO_INCREMENT COMMENT 'Primary key ID', + `workspace_id` int(11) DEFAULT '0' COMMENT 'Workspace ID, default 0, present in all workspaces', + `type` int(1) DEFAULT
'0' COMMENT 'Type: 0 - workspace dev process, 1 - project dev process, 2 - project orchestration mode', + `dev_name` varchar(200) DEFAULT NULL COMMENT 'Name', + `dev_code` varchar(200) NOT NULL COMMENT 'Code; can be used as the value of a checkbox or radio, and its value can serve as the English name', + `title` varchar(200) DEFAULT NULL COMMENT 'Title', + `url` varchar(200) DEFAULT NULL COMMENT 'url', + `url_type` int(1) DEFAULT '0' COMMENT 'URL type: 0 - internal system, 1 - external system; defaults to internal', + `icon` varchar(200) DEFAULT NULL COMMENT 'Icon', + `dev_desc` varchar(500) DEFAULT NULL COMMENT 'Description', + `order_num` int(2) DEFAULT '1' COMMENT 'Order number', + `remark` varchar(200) DEFAULT NULL COMMENT 'Remark', + `create_user` varchar(100) DEFAULT NULL COMMENT 'Creator', + `create_time` datetime DEFAULT CURRENT_TIMESTAMP COMMENT 'Creation time', + `update_user` varchar(100) DEFAULT NULL COMMENT 'Updater', + `update_time` datetime DEFAULT CURRENT_TIMESTAMP COMMENT 'Update time', + PRIMARY KEY (`id`), + UNIQUE KEY `idx_unique_workspace_id` (`workspace_id`,`type`,`dev_code`) +) ENGINE=InnoDB AUTO_INCREMENT=6 DEFAULT CHARSET=utf8 COMMENT='Configuration table for dev processes, orchestration modes, etc.'; + +DROP TABLE IF EXISTS `dss_dictionary`; +CREATE TABLE `dss_dictionary` ( + `id` int(11) NOT NULL AUTO_INCREMENT COMMENT 'Primary key ID', + `workspace_id` int(11) DEFAULT '0' COMMENT 'Workspace ID; defaults to 0, meaning present in all workspaces', + `parent_key` varchar(200) DEFAULT '0' COMMENT 'Parent key', + `dic_name` varchar(200) NOT NULL COMMENT 'Name', + `dic_name_en` varchar(300) DEFAULT NULL COMMENT 'Name (English)', + `dic_key` varchar(200) NOT NULL COMMENT 'Key, acts as a code; workspace keys start with w_, project keys with p_', + `dic_value` varchar(500) DEFAULT NULL COMMENT 'Value for the key', + `dic_value_en` varchar(1000) DEFAULT NULL COMMENT 'Value for the key (English)', + `title` varchar(200) DEFAULT NULL COMMENT 'Title', + `title_en` varchar(400) DEFAULT NULL COMMENT 'Title (English)', + `url` varchar(200) DEFAULT NULL COMMENT 'url', + `url_type` int(1) DEFAULT '0' COMMENT 'URL type: 0 - internal system, 1 - external system; defaults to internal', + `icon` varchar(200) DEFAULT NULL COMMENT 'Icon', + `order_num` int(2) DEFAULT '1' COMMENT 'Order number', + `remark` varchar(1000) DEFAULT NULL COMMENT 'Remark', + `create_user` varchar(100) DEFAULT NULL COMMENT 'Creator', + `create_time` datetime 
DEFAULT CURRENT_TIMESTAMP COMMENT 'Creation time', + `update_user` varchar(100) DEFAULT NULL COMMENT 'Updater', + `update_time` datetime DEFAULT CURRENT_TIMESTAMP COMMENT 'Update time', + PRIMARY KEY (`id`), + UNIQUE KEY `idx_unique_workspace_id` (`workspace_id`,`dic_key`), + KEY `idx_parent_key` (`parent_key`), + KEY `idx_dic_key` (`dic_key`) +) ENGINE=InnoDB AUTO_INCREMENT=24 DEFAULT CHARSET=utf8 COMMENT='Data dictionary table'; + +DROP TABLE IF EXISTS `dss_event_relation`; +CREATE TABLE `dss_event_relation` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `project_version_id` bigint(20) NOT NULL, + `flow_id` bigint(20) NOT NULL, + `msg_type` varchar(45) NOT NULL, + `msg_topic` varchar(45) NOT NULL, + `msg_name` varchar(45) NOT NULL, + `msg_sender` varchar(45) DEFAULT NULL, + `msg_receiver` varchar(45) DEFAULT NULL, + `node_json` varchar(4096) DEFAULT NULL, + `project_id` bigint(20) NOT NULL DEFAULT '0', + PRIMARY KEY (`id`) +) ENGINE=InnoDB DEFAULT CHARSET=utf8 COMMENT='save eventchecker info for application map'; + +DROP TABLE IF EXISTS `dss_flow_edit_lock`; +CREATE TABLE `dss_flow_edit_lock` ( + `id` int(11) NOT NULL AUTO_INCREMENT, + `flow_id` bigint(11) NOT NULL, + `flow_version` varchar(16) NOT NULL, + `project_version_id` bigint(11) NOT NULL, + `create_time` datetime NOT NULL DEFAULT CURRENT_TIMESTAMP, + `update_time` datetime NOT NULL DEFAULT CURRENT_TIMESTAMP, + `owner` varchar(128) NOT NULL, + `lock_stamp` int(8) NOT NULL DEFAULT '0', + `is_expire` tinyint(1) NOT NULL DEFAULT '0', + `lock_content` varchar(512) NOT NULL, + PRIMARY KEY (`id`), + UNIQUE KEY `flow_lock` (`flow_id`,`flow_version`,`project_version_id`) +) ENGINE=InnoDB DEFAULT CHARSET=utf8; --- ---------------------------- --- Table structure for dss_flow_publish_history --- ---------------------------- DROP TABLE IF EXISTS `dss_flow_publish_history`; CREATE TABLE `dss_flow_publish_history` ( `id` bigint(20) NOT NULL AUTO_INCREMENT, @@ -90,76 +353,299 @@ CREATE TABLE `dss_flow_publish_history` ( PRIMARY KEY (`id`) USING BTREE ) 
ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 ROW_FORMAT=COMPACT; --- ---------------------------- --- Table structure for dss_flow_relation --- ---------------------------- DROP TABLE IF EXISTS `dss_flow_relation`; CREATE TABLE `dss_flow_relation` ( `id` bigint(20) NOT NULL AUTO_INCREMENT, `flow_id` bigint(20) DEFAULT NULL, `parent_flow_id` bigint(20) DEFAULT NULL, PRIMARY KEY (`id`) USING BTREE -) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 ROW_FORMAT=COMPACT; +) ENGINE=InnoDB AUTO_INCREMENT=78 DEFAULT CHARSET=utf8mb4 ROW_FORMAT=COMPACT; --- ---------------------------- --- Table structure for dss_flow_taxonomy --- ---------------------------- -DROP TABLE IF EXISTS `dss_flow_taxonomy`; -CREATE TABLE `dss_flow_taxonomy` ( +DROP TABLE IF EXISTS `dss_flow_schedule_info`; +CREATE TABLE `dss_flow_schedule_info` ( `id` bigint(20) NOT NULL AUTO_INCREMENT, - `name` varchar(20) DEFAULT NULL, - `description` varchar(255) DEFAULT NULL, - `creator_id` int(11) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `update_time` datetime DEFAULT NULL, - `project_id` bigint(20) DEFAULT NULL, - PRIMARY KEY (`id`) USING BTREE, - UNIQUE KEY `name` (`name`,`project_id`) USING BTREE -) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 ROW_FORMAT=COMPACT; - --- ---------------------------- --- Table structure for dss_flow_taxonomy_relation --- ---------------------------- -DROP TABLE IF EXISTS `dss_flow_taxonomy_relation`; -CREATE TABLE `dss_flow_taxonomy_relation` ( - `taxonomy_id` bigint(20) NOT NULL, - `flow_id` bigint(20) NOT NULL -) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 ROW_FORMAT=COMPACT; + `flow_id` bigint(20) NOT NULL, + `project_id` bigint(20) NOT NULL, + `workspace_id` bigint(20) NOT NULL, + `schedule_time` varchar(4096) COLLATE utf8_bin DEFAULT NULL, + `alarm_level` varchar(32) COLLATE utf8_bin DEFAULT NULL, + `alarm_user_emails` varchar(4096) COLLATE utf8_bin DEFAULT NULL, + `last_update_time` datetime DEFAULT CURRENT_TIMESTAMP, + PRIMARY KEY (`id`) +) ENGINE=InnoDB DEFAULT CHARSET=utf8 
COLLATE=utf8_bin ROW_FORMAT=COMPACT; +DROP TABLE IF EXISTS `dss_flow_user`; +CREATE TABLE `dss_flow_user` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `flow_id` bigint(20) NOT NULL, + `project_id` bigint(20) NOT NULL, + `username` varchar(100) COLLATE utf8_bin NOT NULL, + `workspace_id` bigint(20) NOT NULL, + `priv` tinyint(5) NOT NULL DEFAULT '0', + `last_update_time` datetime DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_bin ROW_FORMAT=COMPACT; --- ---------------------------- --- Table structure for dss_flow_version --- ---------------------------- DROP TABLE IF EXISTS `dss_flow_version`; CREATE TABLE `dss_flow_version` ( `id` bigint(20) NOT NULL AUTO_INCREMENT, `flow_id` bigint(20) DEFAULT NULL, `source` varchar(255) DEFAULT NULL, `version` varchar(255) DEFAULT NULL, + `bml_version` varchar(255) DEFAULT NULL, `json_path` text, `comment` varchar(255) DEFAULT NULL, `update_time` datetime DEFAULT NULL, - `updator_id` bigint(255) DEFAULT NULL, + `updator` varchar(32) DEFAULT NULL, `project_version_id` bigint(20) DEFAULT NULL, PRIMARY KEY (`id`) USING BTREE -) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 ROW_FORMAT=COMPACT; +) ENGINE=InnoDB AUTO_INCREMENT=2668 DEFAULT CHARSET=utf8mb4 ROW_FORMAT=COMPACT; +DROP TABLE IF EXISTS `dss_homepage_demo_instance`; +CREATE TABLE `dss_homepage_demo_instance` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + `menu_id` int(20) DEFAULT NULL, + `name` varchar(64) DEFAULT NULL, + `url` varchar(128) DEFAULT NULL, + `title_en` varchar(64) DEFAULT NULL, + `title_cn` varchar(64) DEFAULT NULL, + `description` varchar(255) DEFAULT NULL, + `is_active` tinyint(1) DEFAULT '1', + `icon` varchar(255) DEFAULT NULL, + `order` int(2) DEFAULT NULL, + `click_num` int(11) DEFAULT '0', + `create_by` varchar(255) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `last_update_time` datetime DEFAULT NULL, + `last_update_user` varchar(30) DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=10 DEFAULT 
CHARSET=utf8; + +DROP TABLE IF EXISTS `dss_homepage_demo_menu`; +CREATE TABLE `dss_homepage_demo_menu` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + `name` varchar(64) DEFAULT NULL, + `title_en` varchar(64) DEFAULT NULL, + `title_cn` varchar(64) DEFAULT NULL, + `description` varchar(255) DEFAULT NULL, + `is_active` tinyint(1) DEFAULT '1', + `icon` varchar(255) DEFAULT NULL, + `order` int(2) DEFAULT NULL, + `create_by` varchar(255) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `last_update_time` datetime DEFAULT NULL, + `last_update_user` varchar(30) DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=4 DEFAULT CHARSET=utf8; + +DROP TABLE IF EXISTS `dss_homepage_video`; +CREATE TABLE `dss_homepage_video` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + `name` varchar(64) DEFAULT NULL, + `url` varchar(128) DEFAULT NULL, + `title_en` varchar(64) DEFAULT NULL, + `title_cn` varchar(64) DEFAULT NULL, + `description` varchar(255) DEFAULT NULL, + `is_active` tinyint(1) DEFAULT '1', + `icon` varchar(255) DEFAULT NULL, + `order` int(2) DEFAULT NULL, + `play_num` int(11) DEFAULT '0', + `create_by` varchar(255) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `last_update_time` datetime DEFAULT NULL, + `last_update_user` varchar(30) DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=3 DEFAULT CHARSET=utf8; + +DROP TABLE IF EXISTS `dss_input_relation`; +CREATE TABLE `dss_input_relation` ( + `id` bigint(11) NOT NULL AUTO_INCREMENT, + `type` varchar(16) DEFAULT NULL, + `source_env` varchar(16) DEFAULT NULL, + `source_id` bigint(20) DEFAULT NULL, + `target_env` varchar(16) DEFAULT NULL, + `target_id` bigint(20) DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=4 DEFAULT CHARSET=utf8; + +DROP TABLE IF EXISTS `dss_menu`; +CREATE TABLE `dss_menu` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + `name` varchar(255) DEFAULT NULL, + `level` varchar(255) DEFAULT NULL, + `upper_menu_id` int(20) DEFAULT NULL, + `front_name` varchar(255) 
DEFAULT NULL, + `comment` varchar(255) DEFAULT NULL, + `description` varchar(255) DEFAULT NULL, + `is_active` tinyint(4) DEFAULT '1', + `is_component` tinyint(1) NOT NULL DEFAULT '0', + `icon` varchar(128) DEFAULT NULL, + `application_id` int(11) DEFAULT '0', + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=34 DEFAULT CHARSET=utf8; + +DROP TABLE IF EXISTS `dss_menu_component_url`; +CREATE TABLE `dss_menu_component_url` ( + `id` int(10) NOT NULL AUTO_INCREMENT, + `menu_id` int(10) NOT NULL, + `dss_application_id` int(11) DEFAULT NULL, + `url` varchar(512) COLLATE utf8_bin NOT NULL, + `manul_url` varchar(512) COLLATE utf8_bin DEFAULT NULL, + `operation_url` varchar(512) COLLATE utf8_bin DEFAULT NULL, + `update_time` datetime DEFAULT CURRENT_TIMESTAMP, + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=17 DEFAULT CHARSET=utf8 COLLATE=utf8_bin ROW_FORMAT=COMPACT; + +DROP TABLE IF EXISTS `dss_menu_page_relation`; +CREATE TABLE `dss_menu_page_relation` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + PRIMARY KEY (`id`) +) ENGINE=InnoDB DEFAULT CHARSET=utf8; + +DROP TABLE IF EXISTS `dss_menu_role`; +CREATE TABLE `dss_menu_role` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + `workspace_id` int(20) DEFAULT NULL, + `menu_id` int(20) DEFAULT NULL, + `role_id` int(20) DEFAULT NULL, + `priv` int(20) DEFAULT NULL, + `update_time` datetime DEFAULT NULL, + `updateby` varchar(255) DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=5263 DEFAULT CHARSET=utf8; + +DROP TABLE IF EXISTS `dss_onestop_menu`; +CREATE TABLE `dss_onestop_menu` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + `name` varchar(64) DEFAULT NULL, + `title_en` varchar(64) DEFAULT NULL, + `title_cn` varchar(64) DEFAULT NULL, + `description` varchar(255) DEFAULT NULL, + `is_active` tinyint(1) DEFAULT '1', + `icon` varchar(255) DEFAULT NULL, + `order` int(2) DEFAULT NULL, + `create_by` varchar(255) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `last_update_time` datetime DEFAULT NULL, + 
`last_update_user` varchar(30) DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=6 DEFAULT CHARSET=utf8; + +DROP TABLE IF EXISTS `dss_onestop_menu_application`; +CREATE TABLE `dss_onestop_menu_application` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + `application_id` int(20) DEFAULT NULL, + `onestop_menu_id` int(20) NOT NULL, + `title_en` varchar(64) DEFAULT NULL, + `title_cn` varchar(64) DEFAULT NULL, + `desc_en` varchar(255) DEFAULT NULL, + `desc_cn` varchar(255) DEFAULT NULL, + `labels_en` varchar(255) DEFAULT NULL, + `labels_cn` varchar(255) DEFAULT NULL, + `is_active` tinyint(1) DEFAULT NULL, + `access_button_en` varchar(64) DEFAULT NULL, + `access_button_cn` varchar(64) DEFAULT NULL, + `manual_button_en` varchar(64) DEFAULT NULL, + `manual_button_cn` varchar(64) DEFAULT NULL, + `manual_button_url` varchar(255) DEFAULT NULL, + `icon` varchar(255) DEFAULT NULL, + `order` int(2) DEFAULT NULL, + `create_by` varchar(255) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `last_update_time` datetime DEFAULT NULL, + `last_update_user` varchar(30) DEFAULT NULL, + `image` varchar(200) DEFAULT NULL COMMENT 'Image', + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=14 DEFAULT CHARSET=utf8; + + +DROP TABLE IF EXISTS `dss_onestop_user_favorites`; +CREATE TABLE `dss_onestop_user_favorites` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + `username` varchar(64) DEFAULT NULL, + `workspace_id` bigint(20) DEFAULT '1', + `menu_application_id` int(20) DEFAULT NULL, + `order` int(2) DEFAULT NULL, + `create_by` varchar(255) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `last_update_time` datetime DEFAULT NULL, + `last_update_user` varchar(30) DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=94 DEFAULT CHARSET=utf8; + +DROP TABLE IF EXISTS `dss_orchestrator_info`; +CREATE TABLE `dss_orchestrator_info` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `name` varchar(255) NOT NULL, + `type` varchar(255) NOT NULL, + `desc` varchar(1024) DEFAULT 
NULL, + `creator` varchar(100) NOT NULL, + `create_time` datetime DEFAULT NULL, + `project_id` bigint(20) DEFAULT NULL, + `uses` varchar(500) DEFAULT NULL, + `appconn_name` varchar(1024) NOT NULL, + `uuid` varchar(180) NOT NULL, + `secondary_type` varchar(500) DEFAULT NULL, + `is_published` tinyint(1) NOT NULL DEFAULT '0', + PRIMARY KEY (`id`) USING BTREE, + UNIQUE KEY `unique_idx_uuid` (`uuid`) + ) ENGINE=InnoDB AUTO_INCREMENT=326 DEFAULT CHARSET=utf8mb4 ROW_FORMAT=COMPACT; + +DROP TABLE IF EXISTS `dss_orchestrator_schedule_info`; +CREATE TABLE `dss_orchestrator_schedule_info` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `orchestrator_id` bigint(20) NOT NULL, + `project_name` varchar(1024) COLLATE utf8_bin NOT NULL, + `schedule_user` varchar(128) COLLATE utf8_bin DEFAULT NULL, + `schedule_time` varchar(4096) COLLATE utf8_bin DEFAULT NULL, + `alarm_level` varchar(32) COLLATE utf8_bin DEFAULT NULL, + `alarm_user_emails` varchar(4096) COLLATE utf8_bin DEFAULT NULL, + `last_update_time` datetime DEFAULT CURRENT_TIMESTAMP, + `active_flag` VARCHAR(10) DEFAULT 'true' COMMENT 'Schedule flag: true - enabled; false - disabled', + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=39 DEFAULT CHARSET=utf8 COLLATE=utf8_bin ROW_FORMAT=COMPACT; + +DROP TABLE IF EXISTS `dss_orchestrator_user`; +CREATE TABLE `dss_orchestrator_user` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `orchestrator_id` bigint(20) NOT NULL, + `project_id` bigint(20) NOT NULL, + `workspace_id` int(10) NOT NULL DEFAULT '0', + `username` varchar(100) COLLATE utf8_bin NOT NULL, + `priv` tinyint(5) NOT NULL DEFAULT '0', + `last_update_time` datetime DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=58 DEFAULT CHARSET=utf8 COLLATE=utf8_bin ROW_FORMAT=COMPACT; + +DROP TABLE IF EXISTS `dss_orchestrator_version_info`; +CREATE TABLE `dss_orchestrator_version_info` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `orchestrator_id` bigint(20) NOT NULL, + `app_id` bigint(20) DEFAULT NULL, + `source` varchar(255) DEFAULT 
NULL, + `version` varchar(255) DEFAULT NULL, + `comment` varchar(255) DEFAULT NULL, + `update_time` datetime DEFAULT NULL, + `updater` varchar(32) DEFAULT NULL, + `project_id` bigint(20) DEFAULT NULL, + `content` varchar(255) DEFAULT NULL, + `context_id` varchar(200) DEFAULT NULL COMMENT 'Context ID', + PRIMARY KEY (`id`) USING BTREE +) ENGINE=InnoDB AUTO_INCREMENT=422 DEFAULT CHARSET=utf8mb4 ROW_FORMAT=COMPACT; --- ---------------------------- --- Table structure for dss_project --- ---------------------------- DROP TABLE IF EXISTS `dss_project`; CREATE TABLE `dss_project` ( `id` bigint(20) NOT NULL AUTO_INCREMENT, `name` varchar(200) COLLATE utf8_bin DEFAULT NULL, `source` varchar(50) COLLATE utf8_bin DEFAULT NULL COMMENT 'Source of the dss_project', `description` text COLLATE utf8_bin, - `workspace_id` bigint(20) DEFAULT 1, `user_id` bigint(20) DEFAULT NULL, + `username` varchar(32) COLLATE utf8_bin DEFAULT NULL, + `workspace_id` int(11) NOT NULL DEFAULT '0', `create_time` datetime DEFAULT NULL, - `create_by` bigint(20) DEFAULT NULL, + `create_by` varchar(128) COLLATE utf8_bin DEFAULT NULL COMMENT 'Creator', `update_time` datetime DEFAULT NULL, - `update_by` bigint(20) DEFAULT NULL, + `update_by` varchar(128) COLLATE utf8_bin DEFAULT NULL COMMENT 'Modifier', `org_id` bigint(20) DEFAULT NULL COMMENT 'Organization ID', `visibility` bit(1) DEFAULT NULL, `is_transfer` bit(1) DEFAULT NULL COMMENT 'Reserved word', @@ -170,13 +656,15 @@ CREATE TABLE `dss_project` ( `product` varchar(200) COLLATE utf8_bin DEFAULT NULL, `application_area` tinyint(1) DEFAULT NULL, `business` varchar(200) COLLATE utf8_bin DEFAULT NULL, + `is_personal` tinyint(4) NOT NULL DEFAULT '0', + `create_by_str` varchar(256) COLLATE utf8_bin DEFAULT NULL, + `update_by_str` varchar(256) COLLATE utf8_bin DEFAULT NULL, + `dev_process` varchar(200) COLLATE utf8_bin DEFAULT NULL COMMENT 'Dev processes, comma-separated; values are dic_key entries from dss_dictionary (parent_key=p_develop_process)', + `orchestrator_mode` varchar(200) COLLATE utf8_bin 
DEFAULT NULL COMMENT 'Orchestration modes, comma-separated; values are dic_key entries from dss_dictionary (parent_key=p_arrangement_mode or one level below)', + `visible` tinyint(4) DEFAULT '1' COMMENT '0: deleted; 1: not deleted (default)', PRIMARY KEY (`id`) -) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_bin ROW_FORMAT=COMPACT; - +) ENGINE=InnoDB AUTO_INCREMENT=313 DEFAULT CHARSET=utf8 COLLATE=utf8_bin ROW_FORMAT=COMPACT; --- ---------------------------- --- Table structure for dss_project_applications_project --- ---------------------------- DROP TABLE IF EXISTS `dss_project_applications_project`; CREATE TABLE `dss_project_applications_project` ( `project_id` bigint(20) NOT NULL, @@ -184,16 +672,33 @@ CREATE TABLE `dss_project_applications_project` ( `application_project_id` bigint(20) NOT NULL ) ENGINE=InnoDB DEFAULT CHARSET=utf8; +DROP TABLE IF EXISTS `dss_project_orchestrator`; +CREATE TABLE `dss_project_orchestrator` ( + `id` int(11) NOT NULL AUTO_INCREMENT COMMENT 'Primary key ID', + `workspace_id` int(11) DEFAULT NULL COMMENT 'Workspace ID', + `project_id` int(11) DEFAULT NULL COMMENT 'Project ID', + `orchestrator_id` int(11) DEFAULT NULL COMMENT 'Orchestration mode ID (for workflows, the orchestratorId returned by the orchestrator service)', + `orchestrator_version_id` int(11) DEFAULT NULL COMMENT 'Orchestration mode version ID (for workflows, the orchestratorVersionId returned by the orchestrator service)', + `orchestrator_name` varchar(100) DEFAULT NULL COMMENT 'Orchestration name', + `orchestrator_mode` varchar(100) DEFAULT NULL COMMENT 'Orchestration mode; values are dic_key entries from dss_dictionary (parent_key=p_arrangement_mode)', + `orchestrator_way` varchar(256) DEFAULT NULL COMMENT 'Orchestration method', + `uses` varchar(256) DEFAULT NULL COMMENT 'Purpose', + `description` varchar(256) DEFAULT NULL COMMENT 'Description', + `create_user` varchar(100) DEFAULT NULL COMMENT 'Creator', + `create_time` datetime DEFAULT CURRENT_TIMESTAMP COMMENT 'Creation time', + `update_user` varchar(100) DEFAULT NULL COMMENT 'Updater', + `update_time` datetime DEFAULT CURRENT_TIMESTAMP COMMENT 'Update time', + PRIMARY KEY (`id`), + KEY `idx_workspace_id` (`workspace_id`,`project_id`), + KEY `idx_orchestrator_id` (`orchestrator_id`) +) ENGINE=InnoDB 
AUTO_INCREMENT=197 DEFAULT CHARSET=utf8 COMMENT='DSS orchestration mode info table'; --- ---------------------------- --- Table structure for dss_project_publish_history --- ---------------------------- DROP TABLE IF EXISTS `dss_project_publish_history`; CREATE TABLE `dss_project_publish_history` ( `id` bigint(20) NOT NULL AUTO_INCREMENT, `project_version_id` bigint(20) DEFAULT NULL, `create_time` datetime DEFAULT NULL, - `creator_id` bigint(20) DEFAULT NULL, + `creator` varchar(32) COLLATE utf8_bin DEFAULT NULL, `update_time` datetime DEFAULT NULL, `comment` varchar(255) COLLATE utf8_bin DEFAULT NULL, `state` tinyint(255) DEFAULT NULL, @@ -201,37 +706,38 @@ CREATE TABLE `dss_project_publish_history` ( `expire_time` datetime DEFAULT NULL, PRIMARY KEY (`id`) USING BTREE, UNIQUE KEY `project_version_id` (`project_version_id`) USING BTREE -) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_bin ROW_FORMAT=COMPACT; +) ENGINE=InnoDB AUTO_INCREMENT=102 DEFAULT CHARSET=utf8 COLLATE=utf8_bin ROW_FORMAT=COMPACT; --- ---------------------------- --- Table structure for dss_project_taxonomy --- ---------------------------- DROP TABLE IF EXISTS `dss_project_taxonomy`; CREATE TABLE `dss_project_taxonomy` ( `id` bigint(20) NOT NULL AUTO_INCREMENT, `name` varchar(20) DEFAULT NULL, `description` varchar(255) DEFAULT NULL, - `creator_id` int(11) DEFAULT NULL, + `creator` varchar(32) DEFAULT NULL, `create_time` datetime DEFAULT NULL, `update_time` datetime DEFAULT NULL, PRIMARY KEY (`id`) USING BTREE, UNIQUE KEY `name` (`name`) USING BTREE -) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 ROW_FORMAT=COMPACT; +) ENGINE=InnoDB AUTO_INCREMENT=2 DEFAULT CHARSET=utf8mb4 ROW_FORMAT=COMPACT; --- ---------------------------- --- Table structure for dss_project_taxonomy_relation --- ---------------------------- DROP TABLE IF EXISTS `dss_project_taxonomy_relation`; CREATE TABLE `dss_project_taxonomy_relation` ( `taxonomy_id` bigint(20) NOT NULL, `project_id` bigint(20) NOT NULL, - `creator_id` bigint(11) NOT NULL + `creator` 
varchar(32) NOT NULL ) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 ROW_FORMAT=COMPACT; +DROP TABLE IF EXISTS `dss_project_user`; +CREATE TABLE `dss_project_user` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `project_id` int(10) NOT NULL, + `username` varchar(32) COLLATE utf8_bin DEFAULT NULL, + `workspace_id` bigint(20) DEFAULT NULL, + `priv` int(20) DEFAULT NULL, + `last_update_time` datetime DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=1859 DEFAULT CHARSET=utf8 COLLATE=utf8_bin ROW_FORMAT=COMPACT; --- ---------------------------- --- Table structure for dss_project_version --- ---------------------------- DROP TABLE IF EXISTS `dss_project_version`; CREATE TABLE `dss_project_version` ( `id` bigint(20) NOT NULL AUTO_INCREMENT, @@ -239,212 +745,386 @@ CREATE TABLE `dss_project_version` ( `version` varchar(10) COLLATE utf8_bin DEFAULT NULL, `comment` varchar(255) COLLATE utf8_bin DEFAULT NULL, `update_time` datetime DEFAULT NULL, - `updator_id` int(11) DEFAULT NULL, + `updator` varchar(32) COLLATE utf8_bin DEFAULT NULL, `lock` int(255) DEFAULT NULL, PRIMARY KEY (`id`) USING BTREE -) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_bin ROW_FORMAT=COMPACT; +) ENGINE=InnoDB AUTO_INCREMENT=773 DEFAULT CHARSET=utf8 COLLATE=utf8_bin ROW_FORMAT=COMPACT; + +DROP TABLE IF EXISTS `dss_release_task`; +CREATE TABLE `dss_release_task` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `project_id` bigint(20) NOT NULL, + `orchestrator_version_id` bigint(20) NOT NULL, + `orchestrator_id` bigint(20) NOT NULL, + `release_user` varchar(128) NOT NULL, + `create_time` datetime DEFAULT NULL, + `update_time` datetime DEFAULT NULL, + `status` varchar(64) DEFAULT 'init', + PRIMARY KEY (`id`) USING BTREE +) ENGINE=InnoDB AUTO_INCREMENT=605 DEFAULT CHARSET=utf8mb4 ROW_FORMAT=COMPACT; + +DROP TABLE IF EXISTS `dss_role`; +CREATE TABLE `dss_role` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + `workspace_id` varchar(255) DEFAULT NULL, + `name` varchar(255) DEFAULT NULL, + `front_name` 
varchar(255) DEFAULT NULL, + `update_time` datetime DEFAULT NULL, + `description` varchar(512) DEFAULT NULL, + PRIMARY KEY (`id`), + UNIQUE KEY `workspace_id` (`workspace_id`,`name`) +) ENGINE=InnoDB AUTO_INCREMENT=10 DEFAULT CHARSET=utf8 CHECKSUM=1 DELAY_KEY_WRITE=1 ROW_FORMAT=DYNAMIC; + +DROP TABLE IF EXISTS `dss_sidebar`; +CREATE TABLE `dss_sidebar` ( + `id` int(11) NOT NULL AUTO_INCREMENT COMMENT 'Primary key ID', + `workspace_id` int(11) DEFAULT '0' COMMENT 'Workspace ID; defaults to 0, meaning present in all workspaces', + `name` varchar(200) DEFAULT NULL COMMENT 'Name', + `name_en` varchar(400) DEFAULT NULL COMMENT 'Name (English)', + `title` varchar(200) DEFAULT NULL COMMENT 'Title', + `title_en` varchar(400) DEFAULT NULL COMMENT 'Title (English)', + `type` int(1) NOT NULL COMMENT 'Type: 0 - knowledge base, 1 - menu, 2 - FAQ', + `order_num` int(2) DEFAULT '1' COMMENT 'Order number; items are displayed in this order', + `remark` varchar(200) DEFAULT NULL COMMENT 'Remark', + `create_user` varchar(100) DEFAULT NULL COMMENT 'Creator', + `create_time` datetime DEFAULT CURRENT_TIMESTAMP COMMENT 'Creation time', + `update_user` varchar(100) DEFAULT NULL COMMENT 'Updater', + `update_time` datetime DEFAULT CURRENT_TIMESTAMP COMMENT 'Update time', + PRIMARY KEY (`id`), + KEY `idx_workspace_id` (`workspace_id`) +) ENGINE=InnoDB AUTO_INCREMENT=4 DEFAULT CHARSET=utf8 COMMENT='Sidebar table'; + +DROP TABLE IF EXISTS `dss_sidebar_content`; +CREATE TABLE `dss_sidebar_content` ( + `id` int(11) NOT NULL AUTO_INCREMENT COMMENT 'Primary key ID', + `workspace_id` int(11) DEFAULT '0' COMMENT 'Workspace ID; defaults to 0, meaning present in all workspaces', + `sidebar_id` int(11) NOT NULL COMMENT 'Sidebar ID', + `name` varchar(200) DEFAULT NULL COMMENT 'Name', + `name_en` varchar(400) DEFAULT NULL COMMENT 'Name (English)', + `title` varchar(200) DEFAULT NULL COMMENT 'Title', + `title_en` varchar(400) DEFAULT NULL COMMENT 'Title (English)', + `url` varchar(200) DEFAULT NULL COMMENT 'url', + `url_type` int(1) DEFAULT '0' COMMENT 'URL type: 0 - internal system, 1 - external system; defaults to internal', + `icon` varchar(200) DEFAULT NULL COMMENT 'Icon', + `order_num` int(2) DEFAULT '1' COMMENT 'Order number; items are displayed in this order', + `remark` varchar(200) DEFAULT NULL COMMENT 'Remark', + `create_user` 
varchar(100) DEFAULT NULL COMMENT 'Creator', + `create_time` datetime DEFAULT CURRENT_TIMESTAMP COMMENT 'Creation time', + `update_user` varchar(100) DEFAULT NULL COMMENT 'Updater', + `update_time` datetime DEFAULT CURRENT_TIMESTAMP COMMENT 'Update time', + PRIMARY KEY (`id`), + KEY `idx_sidebarws_id` (`workspace_id`,`sidebar_id`) +) ENGINE=InnoDB AUTO_INCREMENT=10 DEFAULT CHARSET=utf8 COMMENT='Sidebar content table'; --- ---------------------------- --- Table structure for dss_user --- ---------------------------- DROP TABLE IF EXISTS `dss_user`; CREATE TABLE `dss_user` ( - `id` int(11) NOT NULL, + `id` int(11) NOT NULL AUTO_INCREMENT, `username` varchar(64) DEFAULT NULL, `name` varchar(64) DEFAULT NULL, - `is_first_login` tinyint(1) DEFAULT NULL + `is_first_login` tinyint(1) DEFAULT NULL, + `is_admin` tinyint(1) DEFAULT '1', + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=214 DEFAULT CHARSET=utf8; + +DROP TABLE IF EXISTS `dss_workflow`; +CREATE TABLE `dss_workflow` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `name` varchar(128) DEFAULT NULL, + `state` tinyint(1) DEFAULT NULL, + `source` varchar(255) DEFAULT NULL, + `description` varchar(255) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `creator` varchar(32) DEFAULT NULL, + `is_root_flow` tinyint(1) DEFAULT NULL, + `rank` int(10) DEFAULT NULL, + `project_id` bigint(20) DEFAULT NULL, + `has_saved` tinyint(1) DEFAULT NULL, + `uses` varchar(255) DEFAULT NULL, + `bml_version` varchar(255) DEFAULT NULL, + `resource_id` varchar(255) DEFAULT NULL, + `linked_appconn_names` varchar(255) DEFAULT NULL, + `dss_labels` varchar(255) DEFAULT NULL, + PRIMARY KEY (`id`) USING BTREE +) ENGINE=InnoDB AUTO_INCREMENT=455 DEFAULT CHARSET=utf8mb4 ROW_FORMAT=COMPACT; + +DROP TABLE IF EXISTS `dss_workflow_node`; +CREATE TABLE `dss_workflow_node` ( + `id` int(11) NOT NULL AUTO_INCREMENT, + `name` varchar(16) DEFAULT NULL, + `appconn_name` varchar(64) DEFAULT '-1' COMMENT 'Name of the AppConn, corresponding to appconn_name in the dss_appconn table', + `node_type` varchar(255) DEFAULT NULL, + 
`jump_url` varchar(255) DEFAULT NULL, + `support_jump` tinyint(1) DEFAULT NULL, + `submit_to_scheduler` tinyint(1) DEFAULT NULL, + `enable_copy` tinyint(1) DEFAULT NULL, + `should_creation_before_node` tinyint(1) DEFAULT NULL, + `icon` longtext, + PRIMARY KEY (`id`) + ) ENGINE=InnoDB AUTO_INCREMENT=21 DEFAULT CHARSET=utf8; + +DROP TABLE IF EXISTS `dss_workflow_node_group`; +CREATE TABLE `dss_workflow_node_group` ( + `id` int(11) NOT NULL AUTO_INCREMENT, + `name` varchar(32) NOT NULL, + `name_en` varchar(32) NOT NULL, + `description` varchar(255) DEFAULT NULL, + `order` tinyint(2) NOT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=9 DEFAULT CHARSET=utf8; + +DROP TABLE IF EXISTS `dss_workflow_node_to_group`; +CREATE TABLE `dss_workflow_node_to_group` ( + `node_id` int(11) NOT NULL, + `group_id` int(11) NOT NULL ) ENGINE=InnoDB DEFAULT CHARSET=utf8; --- ---------------------------- --- Table structure for event_auth --- ---------------------------- -DROP TABLE IF EXISTS `event_auth`; -CREATE TABLE `event_auth` ( - `sender` varchar(45) NOT NULL COMMENT '消息发送者', - `topic` varchar(45) NOT NULL COMMENT '消息主题', - `msg_name` varchar(45) NOT NULL COMMENT '消息名称', - `record_time` datetime NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT '插入记录时间', - `allow_send` int(11) NOT NULL COMMENT '允许发送标志', - PRIMARY KEY (`sender`,`topic`,`msg_name`) -) ENGINE=InnoDB DEFAULT CHARSET=utf8 COMMENT='消息发送授权表'; +DROP TABLE IF EXISTS `dss_workflow_node_to_ui`; +CREATE TABLE `dss_workflow_node_to_ui` ( + `workflow_node_id` int(11) NOT NULL, + `ui_id` int(11) NOT NULL +) ENGINE=InnoDB DEFAULT CHARSET=utf8; --- ---------------------------- --- Table structure for event_queue --- ---------------------------- -DROP TABLE IF EXISTS `event_queue`; -CREATE TABLE `event_queue` ( - `msg_id` int(10) unsigned NOT NULL AUTO_INCREMENT COMMENT '消息ID号', - `sender` varchar(45) NOT NULL COMMENT '消息发送者', - `send_time` datetime NOT NULL COMMENT '消息发送时间', - `topic` varchar(45) NOT NULL COMMENT '消息主题', - 
`msg_name` varchar(45) NOT NULL COMMENT '消息名称', - `msg` varchar(250) DEFAULT NULL COMMENT '消息内容', - `send_ip` varchar(45) NOT NULL, - PRIMARY KEY (`msg_id`) -) ENGINE=InnoDB AUTO_INCREMENT=154465 DEFAULT CHARSET=utf8 COMMENT='azkaban调取系统消息队列表'; --- ---------------------------- --- Table structure for event_status --- ---------------------------- -DROP TABLE IF EXISTS `event_status`; -CREATE TABLE `event_status` ( - `receiver` varchar(45) NOT NULL COMMENT '消息接收者', - `receive_time` datetime NOT NULL COMMENT '消息接收时间', - `topic` varchar(45) NOT NULL COMMENT '消息主题', - `msg_name` varchar(45) NOT NULL COMMENT '消息名称', - `msg_id` int(11) NOT NULL COMMENT '消息的最大消费id', - PRIMARY KEY (`receiver`,`topic`,`msg_name`) -) ENGINE=InnoDB DEFAULT CHARSET=utf8 COMMENT='消息消费状态表'; +DROP TABLE IF EXISTS `dss_workflow_node_ui`; +CREATE TABLE `dss_workflow_node_ui` ( + `id` int(11) NOT NULL AUTO_INCREMENT, + `key` varchar(64) NOT NULL, + `description` varchar(255) DEFAULT NULL, + `description_en` varchar(255) DEFAULT NULL, + `lable_name` varchar(64) NOT NULL, + `lable_name_en` varchar(64) NOT NULL, + `ui_type` varchar(16) NOT NULL, + `required` tinyint(1) NOT NULL, + `value` varchar(255) DEFAULT NULL, + `default_value` varchar(255) DEFAULT NULL, + `is_hidden` tinyint(1) NOT NULL, + `condition` varchar(255) DEFAULT NULL, + `is_advanced` tinyint(1) NOT NULL, + `order` tinyint(2) NOT NULL, + `node_menu_type` tinyint(1) NOT NULL, + `is_base_info` tinyint(1) NOT NULL, + `position` varchar(32) NOT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=45 DEFAULT CHARSET=utf8; + +DROP TABLE IF EXISTS `dss_workflow_node_ui_to_validate`; +CREATE TABLE `dss_workflow_node_ui_to_validate` ( + `ui_id` int(11) NOT NULL, + `validate_id` int(11) NOT NULL +) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4; + +DROP TABLE IF EXISTS `dss_workflow_node_ui_validate`; +CREATE TABLE `dss_workflow_node_ui_validate` ( + `id` int(11) NOT NULL AUTO_INCREMENT, + `validate_type` varchar(16) NOT NULL, + `validate_range` 
varchar(255) DEFAULT NULL, + `error_msg` varchar(255) DEFAULT NULL, + `error_msg_en` varchar(255) DEFAULT NULL, + `trigger` varchar(16) DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB AUTO_INCREMENT=59 DEFAULT CHARSET=utf8mb4; + +DROP TABLE IF EXISTS `dss_workflow_project`; +CREATE TABLE `dss_workflow_project` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `name` varchar(200) COLLATE utf8_bin DEFAULT NULL, + `source` varchar(50) COLLATE utf8_bin DEFAULT NULL COMMENT 'Source of the dss_project', + `description` text COLLATE utf8_bin, + `user_id` bigint(20) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `create_by` bigint(20) DEFAULT NULL, + `update_time` datetime DEFAULT NULL, + `update_by` datetime DEFAULT NULL, + `org_id` bigint(20) DEFAULT NULL COMMENT 'Organization ID', + `visibility` bit(1) DEFAULT NULL, + `is_transfer` bit(1) DEFAULT NULL COMMENT 'Reserved word', + `initial_org_id` bigint(20) DEFAULT NULL, + `isArchive` bit(1) DEFAULT b'0' COMMENT 'If it is archived', + `pic` varchar(255) COLLATE utf8_bin DEFAULT NULL, + `star_num` int(11) DEFAULT '0', + `product` varchar(200) COLLATE utf8_bin DEFAULT NULL, + `application_area` tinyint(1) DEFAULT NULL, + `business` varchar(200) COLLATE utf8_bin DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_bin ROW_FORMAT=COMPACT; +DROP TABLE IF EXISTS `dss_workflow_project_priv`; +CREATE TABLE `dss_workflow_project_priv` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `workapce_id` bigint(20) DEFAULT NULL, + `project_id` bigint(20) DEFAULT NULL, + `user_id` bigint(20) DEFAULT NULL, + `priv` varchar(255) DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB DEFAULT CHARSET=utf8; + +DROP TABLE IF EXISTS `dss_workflow_task`; +CREATE TABLE `dss_workflow_task` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT COMMENT 'Primary Key, auto increment', + `instance` varchar(50) DEFAULT NULL COMMENT 'An instance of Entrance, consists of IP address of the entrance server and port', + `exec_id` 
varchar(50) DEFAULT NULL COMMENT 'execution ID, consists of jobID (generated by scheduler), executeApplicationName, creator and instance', + `um_user` varchar(50) DEFAULT NULL COMMENT 'User name', + `submit_user` varchar(50) DEFAULT NULL COMMENT 'submitUser name', + `execution_code` text COMMENT 'Run script. When exceeding 6000 lines, script would be stored in HDFS and its file path would be stored in database', + `progress` float DEFAULT NULL COMMENT 'Script execution progress, between zero and one', + `log_path` varchar(200) DEFAULT NULL COMMENT 'File path of the log files', + `result_location` varchar(200) DEFAULT NULL COMMENT 'File path of the result', + `status` varchar(50) DEFAULT NULL COMMENT 'Script execution status, must be one of the following: Inited, WaitForRetry, Scheduled, Running, Succeed, Failed, Cancelled, Timeout', + `created_time` datetime DEFAULT NULL COMMENT 'Creation time', + `updated_time` datetime DEFAULT NULL COMMENT 'Update time', + `run_type` varchar(50) DEFAULT NULL COMMENT 'Further refinement of execution_application_time, e.g., specifying whether to run pySpark or SparkR', + `err_code` int(11) DEFAULT NULL COMMENT 'Error code. Generated when the execution of the script fails', + `err_desc` text COMMENT 'Execution description. 
Generated when the execution of the script fails', + `execute_application_name` varchar(200) DEFAULT NULL COMMENT 'The service a user selects, e.g., Spark, Python, R, etc', + `request_application_name` varchar(200) DEFAULT NULL COMMENT 'Parameter name for creator', + `script_path` varchar(200) DEFAULT NULL COMMENT 'Path of the script in workspace', + `params` text COMMENT 'Configuration item of the parameters', + `engine_instance` varchar(50) DEFAULT NULL COMMENT 'An instance of engine, consists of IP address of the engine server and port', + `task_resource` varchar(1024) DEFAULT NULL, + `engine_start_time` time DEFAULT NULL, + `label_json` varchar(200) DEFAULT NULL COMMENT 'label json', + PRIMARY KEY (`id`), + KEY `created_time` (`created_time`), + KEY `um_user` (`um_user`) +) ENGINE=InnoDB AUTO_INCREMENT=715 DEFAULT CHARSET=utf8mb4; --- ---------------------------- --- Table structure for dss_workspace --- ---------------------------- DROP TABLE IF EXISTS `dss_workspace`; CREATE TABLE `dss_workspace` ( `id` bigint(20) NOT NULL AUTO_INCREMENT, `name` varchar(255) DEFAULT NULL, `label` varchar(255) DEFAULT NULL, `description` varchar(255) DEFAULT NULL, + `create_by` varchar(255) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, `department` varchar(255) DEFAULT NULL, `product` varchar(255) DEFAULT NULL, `source` varchar(255) DEFAULT NULL, - `create_by` varchar(255) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, `last_update_time` datetime DEFAULT NULL, - `last_update_user` varchar(30) DEFAULT NULL, + `last_update_user` varchar(30) DEFAULT NULL COMMENT 'Last modified by', PRIMARY KEY (`id`), UNIQUE KEY `name` (`name`) -) ENGINE=InnoDB DEFAULT CHARSET=utf8; +) ENGINE=InnoDB AUTO_INCREMENT=224 DEFAULT CHARSET=utf8; --- ---------------------------- --- Table structure for dss_onestop_menu --- ---------------------------- -DROP TABLE IF EXISTS `dss_onestop_menu`; -CREATE TABLE `dss_onestop_menu` ( +DROP TABLE IF EXISTS `dss_workspace_datasource`; +CREATE TABLE 
`dss_workspace_datasource` ( `id` int(20) NOT NULL AUTO_INCREMENT, - `name` varchar(64) DEFAULT NULL, - `title_en` varchar(64) DEFAULT NULL, - `title_cn` varchar(64) DEFAULT NULL, - `description` varchar(255) DEFAULT NULL, - `is_active` tinyint(1) DEFAULT 1, - `icon` varchar(255) DEFAULT NULL, - `order` int(2) DEFAULT NULL, - `create_by` varchar(255) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, + `workspace_id` int(20) DEFAULT NULL, + `datasource_name` varchar(255) DEFAULT NULL, + `type` varchar(255) DEFAULT NULL, + `created_time` datetime DEFAULT NULL, + `env` varchar(255) DEFAULT NULL, + `creater` varchar(255) DEFAULT NULL, + `responser` varchar(255) DEFAULT NULL, + `last_update_user` varchar(255) DEFAULT NULL, `last_update_time` datetime DEFAULT NULL, - `last_update_user` varchar(30) DEFAULT NULL, PRIMARY KEY (`id`) ) ENGINE=InnoDB DEFAULT CHARSET=utf8; --- ---------------------------- --- Table structure for dss_onestop_menu_application --- ---------------------------- -DROP TABLE IF EXISTS `dss_onestop_menu_application`; -CREATE TABLE `dss_onestop_menu_application` ( +DROP TABLE IF EXISTS `dss_workspace_homepage`; +CREATE TABLE `dss_workspace_homepage` ( `id` int(20) NOT NULL AUTO_INCREMENT, - `application_id` int(20) DEFAULT NULL, - `onestop_menu_id` int(20) NOT NULL, - `title_en` varchar(64) DEFAULT NULL, - `title_cn` varchar(64) DEFAULT NULL, - `desc_en` varchar(255) DEFAULT NULL, - `desc_cn` varchar(255) DEFAULT NULL, - `labels_en` varchar(255) DEFAULT NULL, - `labels_cn` varchar(255) DEFAULT NULL, - `is_active` tinyint(1) DEFAULT NULL, - `access_button_en` varchar(64) DEFAULT NULL, - `access_button_cn` varchar(64) DEFAULT NULL, - `manual_button_en` varchar(64) DEFAULT NULL, - `manual_button_cn` varchar(64) DEFAULT NULL, - `manual_button_url` varchar(255) DEFAULT NULL, - `icon` varchar(255) DEFAULT NULL, - `order` int(2) DEFAULT NULL, - `create_by` varchar(255) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `last_update_time` datetime DEFAULT 
NULL, - `last_update_user` varchar(30) DEFAULT NULL, + `workspace_id` int(10) NOT NULL, + `role_id` int(20) DEFAULT NULL, + `homepage_url` varchar(256) DEFAULT NULL, + `update_time` datetime DEFAULT NULL, PRIMARY KEY (`id`) -) ENGINE=InnoDB DEFAULT CHARSET=utf8; +) ENGINE=InnoDB AUTO_INCREMENT=1213 DEFAULT CHARSET=utf8; --- ---------------------------- --- Table structure for dss_onestop_user_favorites --- ---------------------------- -DROP TABLE IF EXISTS `dss_onestop_user_favorites`; -CREATE TABLE `dss_onestop_user_favorites` ( +DROP TABLE IF EXISTS `dss_workspace_public_table`; +CREATE TABLE `dss_workspace_public_table` ( `id` int(20) NOT NULL AUTO_INCREMENT, - `username` varchar(64) DEFAULT NULL, - `workspace_id` bigint(20) DEFAULT 1, - `menu_application_id` int(20) DEFAULT NULL, - `order` int(2) DEFAULT NULL, - `create_by` varchar(255) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, + `workspace_id` int(20) DEFAULT NULL, + `table_name` varchar(255) DEFAULT NULL, + `type` varchar(255) DEFAULT NULL, + `created_time` datetime DEFAULT NULL, + `env` varchar(255) DEFAULT NULL, + `creater` varchar(255) DEFAULT NULL, + `responser` varchar(255) DEFAULT NULL, + `last_update_user` varchar(255) DEFAULT NULL, `last_update_time` datetime DEFAULT NULL, - `last_update_user` varchar(30) DEFAULT NULL, PRIMARY KEY (`id`) ) ENGINE=InnoDB DEFAULT CHARSET=utf8; --- ---------------------------- --- Table structure for dss_homepage_demo_menu --- ---------------------------- -DROP TABLE IF EXISTS `dss_homepage_demo_menu`; -CREATE TABLE `dss_homepage_demo_menu` ( +DROP TABLE IF EXISTS `dss_workspace_role`; +CREATE TABLE `dss_workspace_role` ( `id` int(20) NOT NULL AUTO_INCREMENT, - `name` varchar(64) DEFAULT NULL, - `title_en` varchar(64) DEFAULT NULL, - `title_cn` varchar(64) DEFAULT NULL, - `description` varchar(255) DEFAULT NULL, - `is_active` tinyint(1) DEFAULT 1, - `icon` varchar(255) DEFAULT NULL, - `order` int(2) DEFAULT NULL, - `create_by` varchar(255) DEFAULT NULL, - 
`create_time` datetime DEFAULT NULL, - `last_update_time` datetime DEFAULT NULL, - `last_update_user` varchar(30) DEFAULT NULL, + `workspace_id` int(20) DEFAULT NULL, + `role_id` int(20) DEFAULT NULL, PRIMARY KEY (`id`) ) ENGINE=InnoDB DEFAULT CHARSET=utf8; --- ---------------------------- --- Table structure for dss_homepage_demo_instance --- ---------------------------- -DROP TABLE IF EXISTS `dss_homepage_demo_instance`; -CREATE TABLE `dss_homepage_demo_instance` ( - `id` int(20) NOT NULL AUTO_INCREMENT, - `menu_id` int(20) DEFAULT NULL, - `name` varchar(64) DEFAULT NULL, - `url` varchar(128) DEFAULT NULL, - `title_en` varchar(64) DEFAULT NULL, - `title_cn` varchar(64) DEFAULT NULL, - `description` varchar(255) DEFAULT NULL, - `is_active` tinyint(1) DEFAULT 1, - `icon` varchar(255) DEFAULT NULL, - `order` int(2) DEFAULT NULL, - `click_num` int(11) DEFAULT 0, - `create_by` varchar(255) DEFAULT NULL, +DROP TABLE IF EXISTS `dss_workspace_user`; +CREATE TABLE `dss_workspace_user` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `workspace_id` bigint(20) DEFAULT NULL, + `username` varchar(32) DEFAULT NULL, + `join_time` datetime DEFAULT NULL, + `created_by` varchar(255) DEFAULT NULL, + `user_id` bigint(20) DEFAULT NULL, + PRIMARY KEY (`id`), + UNIQUE KEY `workspace_id` (`workspace_id`,`username`) +) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8 comment 'Workspace user table'; + +DROP TABLE IF EXISTS `dss_workspace_user_role`; +CREATE TABLE `dss_workspace_user_role` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `workspace_id` bigint(20) DEFAULT NULL, + `username` varchar(32) DEFAULT NULL, + `role_id` int(20) DEFAULT NULL, `create_time` datetime DEFAULT NULL, - `last_update_time` datetime DEFAULT NULL, - `last_update_user` varchar(30) DEFAULT NULL, + `created_by` varchar(255) DEFAULT NULL, + `user_id` bigint(20) DEFAULT NULL, PRIMARY KEY (`id`) -) ENGINE=InnoDB DEFAULT CHARSET=utf8; +) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8 comment 'Workspace user-role relation table'; --- 
---------------------------- --- Table structure for dss_homepage_video --- ---------------------------- -DROP TABLE IF EXISTS `dss_homepage_video`; -CREATE TABLE `dss_homepage_video` ( - `id` int(20) NOT NULL AUTO_INCREMENT, - `name` varchar(64) DEFAULT NULL, - `url` varchar(128) DEFAULT NULL, - `title_en` varchar(64) DEFAULT NULL, - `title_cn` varchar(64) DEFAULT NULL, +DROP TABLE IF EXISTS `event_auth`; +CREATE TABLE `event_auth` ( + `sender` varchar(45) NOT NULL COMMENT 'Message sender', + `topic` varchar(45) NOT NULL COMMENT 'Message topic', + `msg_name` varchar(45) NOT NULL COMMENT 'Message name', + `record_time` datetime NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT 'Record insertion time', + `allow_send` int(11) NOT NULL COMMENT 'Send-allowed flag', + PRIMARY KEY (`sender`,`topic`,`msg_name`) +) ENGINE=InnoDB DEFAULT CHARSET=utf8 COMMENT='Message send authorization table'; + +DROP TABLE IF EXISTS `event_queue`; +CREATE TABLE `event_queue` ( + `msg_id` int(10) unsigned NOT NULL AUTO_INCREMENT COMMENT 'Message ID', + `sender` varchar(45) NOT NULL COMMENT 'Message sender', + `send_time` datetime NOT NULL COMMENT 'Message send time', + `topic` varchar(45) NOT NULL COMMENT 'Message topic', + `msg_name` varchar(45) NOT NULL COMMENT 'Message name', + `msg` varchar(250) DEFAULT NULL COMMENT 'Message content', + `send_ip` varchar(45) NOT NULL, + PRIMARY KEY (`msg_id`) +) ENGINE=InnoDB AUTO_INCREMENT=21068 DEFAULT CHARSET=utf8 COMMENT='Azkaban scheduling system message queue table'; + +DROP TABLE IF EXISTS `event_status`; +CREATE TABLE `event_status` ( + `receiver` varchar(45) NOT NULL COMMENT 'Message receiver', + `receive_time` datetime NOT NULL COMMENT 'Message receive time', + `topic` varchar(45) NOT NULL COMMENT 'Message topic', + `msg_name` varchar(45) NOT NULL COMMENT 'Message name', + `msg_id` int(11) NOT NULL COMMENT 'Max consumed message ID', + PRIMARY KEY (`receiver`,`topic`,`msg_name`) +) ENGINE=InnoDB DEFAULT CHARSET=utf8 COMMENT='Message consumption status table'; + +DROP TABLE IF EXISTS `linkis_user`; +CREATE TABLE `linkis_user` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `email` varchar(255) DEFAULT NULL, + `username` varchar(255) NOT NULL, + `password` varchar(255) DEFAULT NULL, + 
`admin` tinyint(1) DEFAULT NULL COMMENT 'If it is an administrator', + `active` tinyint(1) DEFAULT NULL COMMENT 'If it is active', + `name` varchar(255) DEFAULT NULL COMMENT 'User name', `description` varchar(255) DEFAULT NULL, - `is_active` tinyint(1) DEFAULT 1, - `icon` varchar(255) DEFAULT NULL, - `order` int(2) DEFAULT NULL, - `play_num` int(11) DEFAULT 0, - `create_by` varchar(255) DEFAULT NULL, - `create_time` datetime DEFAULT NULL, - `last_update_time` datetime DEFAULT NULL, - `last_update_user` varchar(30) DEFAULT NULL, + `department` varchar(255) DEFAULT NULL, + `avatar` varchar(255) DEFAULT NULL COMMENT 'Path of the avatar', + `create_time` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP, + `create_by` bigint(20) DEFAULT '0', + `update_time` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP, + `update_by` bigint(20) DEFAULT '0', + `is_first_login` bit(1) DEFAULT NULL COMMENT 'If it is the first time to log in', PRIMARY KEY (`id`) -) ENGINE=InnoDB DEFAULT CHARSET=utf8; - +) ENGINE=InnoDB DEFAULT CHARSET=utf8; \ No newline at end of file diff --git a/db/dss_dml.sql b/db/dss_dml.sql index 78bf09bd4..3e781a62a 100644 --- a/db/dss_dml.sql +++ b/db/dss_dml.sql @@ -1,108 +1,525 @@ -INSERT INTO `dss_application` (`id`, `name`, `url`, `is_user_need_init`, `level`, `user_init_url`, `exists_project_service`, `project_url`, `enhance_json`, `if_iframe`, `homepage_url`, `redirect_url`) VALUES (NULL, 'linkis', null, '0', '1', NULL, '0', '/home', NULL, '0', '/home', NULL); -INSERT INTO `dss_application` (`id`, `name`, `url`, `is_user_need_init`, `level`, `user_init_url`, `exists_project_service`, `project_url`, `enhance_json`, `if_iframe`, `homepage_url`, `redirect_url`) VALUES (NULL, 'workflow', null, '0', '1', NULL, '0', '/workflow', NULL, '0', '/project', NULL); -INSERT INTO `dss_application` (`id`, `name`, `url`, `is_user_need_init`, `level`, `user_init_url`, `exists_project_service`, `project_url`, `enhance_json`, `if_iframe`, `homepage_url`, `redirect_url`) VALUES (NULL, 
'console', null, '0', '1', NULL, '0', '/console', NULL, '0', '/console', NULL); - -SELECT @linkis_appid:=id from dss_application WHERE `name` = 'linkis'; -SELECT @workflow_appid:=id from dss_application WHERE `name` = 'workflow'; -INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.python.python', @linkis_appid, '1', '1', '0', '1', NULL); -INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.spark.py', @linkis_appid, '1', '1', '0', '1', NULL); -INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.spark.sql', @linkis_appid, '1', '1', '0', '1', NULL); -INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.spark.scala', @linkis_appid, '1', '1', '0', '1', NULL); -INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.hive.hql', @linkis_appid, '1', '1', '0', '1', NULL); -INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.jdbc.jdbc', @linkis_appid, '1', '1', '0', '1', NULL); -INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 
'linkis.control.empty', @linkis_appid, '1', '1', '0', '0', NULL); -INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.appjoint.sendemail', @linkis_appid, '1', '1', '0', '0', NULL); -INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.appjoint.eventchecker.eventsender', @linkis_appid, '1', '1', '0', '0', NULL); -INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.appjoint.eventchecker.eventreceiver', @linkis_appid, '1', '1', '0', '0', NULL); -INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.appjoint.datachecker', @linkis_appid, '1', '1', '0', '0', NULL); -INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'workflow.subflow', @workflow_appid, '1', '0', '1', '1', NULL); - - - -INSERT INTO `dss_project_taxonomy` (`id`, `name`, `description`, `creator_id`, `create_time`, `update_time`) VALUES (NULL, 'My project', NULL, '-1', NULL, NULL); - -INSERT INTO `dss_flow_taxonomy` (`id`, `name`, `description`, `creator_id`, `create_time`, `update_time`, `project_id`) VALUES (NULL, 'My workflow', NULL, NULL, NULL,NULL, '-1'); - -UPDATE `dss_application` SET url = 'http://GATEWAY_INSTALL_IP_2:GATEWAY_PORT' WHERE `name` in('linkis','workflow'); - -INSERT INTO `linkis_config_key` (`id`, `key`, `description`, `name`, `application_id`, 
`default_value`, `validate_type`, `validate_range`, `is_hidden`, `is_advanced`, `level`) VALUES (NULL, 'spark.executor.instances', '取值范围:1-40,单位:个', '执行器实例最大并发数', @application_id, '2', 'NumInterval', '[1,40]', '0', '0', '2'); -INSERT INTO `linkis_config_key` (`id`, `key`, `description`, `name`, `application_id`, `default_value`, `validate_type`, `validate_range`, `is_hidden`, `is_advanced`, `level`) VALUES (NULL, 'spark.executor.cores', '取值范围:1-8,单位:个', '执行器核心个数', @application_id, '2', 'NumInterval', '[1,2]', '1', '0', '1'); -INSERT INTO `linkis_config_key` (`id`, `key`, `description`, `name`, `application_id`, `default_value`, `validate_type`, `validate_range`, `is_hidden`, `is_advanced`, `level`) VALUES (NULL, 'spark.executor.memory', '取值范围:3-15,单位:G', '执行器内存大小', @application_id, '3', 'NumInterval', '[3,15]', '0', '0', '3'); -INSERT INTO `linkis_config_key` (`id`, `key`, `description`, `name`, `application_id`, `default_value`, `validate_type`, `validate_range`, `is_hidden`, `is_advanced`, `level`) VALUES (NULL, 'spark.driver.cores', '取值范围:只能取1,单位:个', '驱动器核心个数', @application_id, '1', 'NumInterval', '[1,1]', '1', '1', '1'); -INSERT INTO `linkis_config_key` (`id`, `key`, `description`, `name`, `application_id`, `default_value`, `validate_type`, `validate_range`, `is_hidden`, `is_advanced`, `level`) VALUES (NULL, 'spark.driver.memory', '取值范围:1-15,单位:G', '驱动器内存大小', @application_id, '2', 'NumInterval', '[1,15]', '0', '0', '1'); -INSERT INTO `linkis_config_key` (`id`, `key`, `description`, `name`, `application_id`, `default_value`, `validate_type`, `validate_range`, `is_hidden`, `is_advanced`, `level`) VALUES (NULL, 'wds.linkis.instance', '范围:1-3,单位:个', 'spark引擎最大并发数', @application_id, '3', 'NumInterval', '[1,3]', '0', '0', '1'); - -select @key_id1:=id from `linkis_config_key` where `application_id` = @application_id and `key` = 'spark.executor.instances'; -select @key_id2:=id from `linkis_config_key` where `application_id` = @application_id and `key` = 
'spark.executor.cores'; -select @key_id3:=id from `linkis_config_key` where `application_id` = @application_id and `key` = 'spark.executor.memory'; -select @key_id4:=id from `linkis_config_key` where `application_id` = @application_id and `key` = 'spark.driver.cores'; -select @key_id5:=id from `linkis_config_key` where `application_id` = @application_id and `key` = 'spark.driver.memory'; -select @key_id6:=id from `linkis_config_key` where `application_id` = @application_id and `key` = 'wds.linkis.instance'; - -SELECT @tree_id1:=t.id from linkis_config_tree t LEFT JOIN linkis_application a on t.application_id = a.id WHERE t.`name` = 'spark资源设置' and a.`name` = 'spark'; -SELECT @tree_id2:=t.id from linkis_config_tree t LEFT JOIN linkis_application a on t.application_id = a.id WHERE t.`name` = 'spark引擎设置' and a.`name` = 'spark'; - -insert into `linkis_config_key_tree` VALUES(NULL,@key_id1,@tree_id1); -insert into `linkis_config_key_tree` VALUES(NULL,@key_id2,@tree_id1); -insert into `linkis_config_key_tree` VALUES(NULL,@key_id3,@tree_id1); -insert into `linkis_config_key_tree` VALUES(NULL,@key_id4,@tree_id1); -insert into `linkis_config_key_tree` VALUES(NULL,@key_id5,@tree_id1); -insert into `linkis_config_key_tree` VALUES(NULL,@key_id6,@tree_id2); - -#-----------------------jdbc------------------- - -select @application_id:=id from `linkis_application` where `name` = 'nodeexecution'; -INSERT INTO `linkis_application` (`id`, `name`, `chinese_name`, `description`) SELECT NULL,'nodeexecution',`chinese_name`,`description` FROM linkis_application WHERE @application_id IS NULL LIMIT 1 ; -select @jdbc_id:=id from `linkis_application` where `name` = 'jdbc'; - -INSERT INTO `linkis_config_key` (`id`, `key`, `description`, `name`, `application_id`, `default_value`, `validate_type`, `validate_range`, `is_hidden`, `is_advanced`, `level`) VALUES (NULL, 'jdbc.url', '格式:', 'jdbc连接地址', @application_id, NULL , 'None', NULL , '0', '0', '1'); -INSERT INTO `linkis_config_key` (`id`, 
`key`, `description`, `name`, `application_id`, `default_value`, `validate_type`, `validate_range`, `is_hidden`, `is_advanced`, `level`) VALUES (NULL, 'jdbc.username', NULL , 'jdbc连接用户名', @application_id, NULL, 'None', NULL , '0', '0', '1'); -INSERT INTO `linkis_config_key` (`id`, `key`, `description`, `name`, `application_id`, `default_value`, `validate_type`, `validate_range`, `is_hidden`, `is_advanced`, `level`) VALUES (NULL, 'jdbc.password', NULL , 'jdbc连接密码', @application_id, NULL , 'None', NULL , '0', '0', '1'); - -select @key_id1:=id from `linkis_config_key` where `application_id` = @application_id and `key` = 'jdbc.url'; -select @key_id2:=id from `linkis_config_key` where `application_id` = @application_id and `key` = 'jdbc.username'; -select @key_id3:=id from `linkis_config_key` where `application_id` = @application_id and `key` = 'jdbc.password'; - -SELECT @tree_id1:=t.id from linkis_config_tree t LEFT JOIN linkis_application a on t.application_id = a.id WHERE t.`name` = 'jdbc连接设置' and a.`name` = 'jdbc'; - -insert into `linkis_config_key_tree` VALUES(NULL,@key_id1,@tree_id1); -insert into `linkis_config_key_tree` VALUES(NULL,@key_id2,@tree_id1); -insert into `linkis_config_key_tree` VALUES(NULL,@key_id3,@tree_id1); - -INSERT INTO dss_workspace (id, name, label, description, department, product, source, create_by, create_time, last_update_time, last_update_user) VALUES (1, 'default', 'default', 'default user workspace', NULL, NULL, 'create by user', 'root', NULL, NULL, 'root'); - -INSERT INTO dss_homepage_demo_instance (id, menu_id, name, url, title_en, title_cn, description, is_active, icon, `order`, click_num, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, 1, '工作流编辑执行', 'https://github.com/WeBankFinTech/DataSphereStudio', 'workflow edit execution', '工作流编辑执行', '工作流编辑执行', 1, NULL, 1, 0, NULL, NULL, NULL, NULL); -INSERT INTO dss_homepage_demo_instance (id, menu_id, name, url, title_en, title_cn, description, is_active, icon, 
`order`, click_num, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, 1, '工作流串联可视化', 'https://github.com/WeBankFinTech/DataSphereStudio', 'workflow series visualization', '工作流串联可视化', '工作流串联可视化', 1, NULL, 2, 0, NULL, NULL, NULL, NULL); -INSERT INTO dss_homepage_demo_instance (id, menu_id, name, url, title_en, title_cn, description, is_active, icon, `order`, click_num, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, 1, '工作流调度执行跑批', 'https://github.com/WeBankFinTech/DataSphereStudio', 'workflow scheduling execution run batch', '工作流调度执行跑批', '工作流调度执行跑批', 1, NULL, 3, 0, NULL, NULL, NULL, NULL); -INSERT INTO dss_homepage_demo_instance (id, menu_id, name, url, title_en, title_cn, description, is_active, icon, `order`, click_num, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, 2, '某业务日常运营报表', 'https://github.com/WeBankFinTech/DataSphereStudio', 'business daily operation report', '某业务日常运营报表', '某业务日常运营报表', 1, NULL, 1, 0, NULL, NULL, NULL, NULL); -INSERT INTO dss_homepage_demo_instance (id, menu_id, name, url, title_en, title_cn, description, is_active, icon, `order`, click_num, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, 2, '某业务机器学习建模预测', 'https://github.com/WeBankFinTech/DataSphereStudio', 'business machine learning modeling prediction', '某业务机器学习建模预测', '某业务机器学习建模预测', 1, NULL, 2, 0, NULL, NULL, NULL, NULL); -INSERT INTO dss_homepage_demo_instance (id, menu_id, name, url, title_en, title_cn, description, is_active, icon, `order`, click_num, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, 2, '某业务导出营销用户列表', 'https://github.com/WeBankFinTech/DataSphereStudio', 'business export marketing user list', '某业务导出营销用户列表', '某业务导出营销用户列表', 1, NULL, 3, 0, NULL, NULL, NULL, NULL); -INSERT INTO dss_homepage_demo_instance (id, menu_id, name, url, title_en, title_cn, description, is_active, icon, `order`, click_num, create_by, create_time, last_update_time, 
last_update_user) VALUES (NULL, 3, '数据大屏体验', 'https://github.com/WeBankFinTech/DataSphereStudio', 'data big screen experience', '数据大屏体验', '数据大屏体验', 1, NULL, 1, 0, NULL, NULL, NULL, NULL); -INSERT INTO dss_homepage_demo_instance (id, menu_id, name, url, title_en, title_cn, description, is_active, icon, `order`, click_num, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, 3, '数据仪表盘体验', 'https://github.com/WeBankFinTech/DataSphereStudio', 'data dashboard experience', '数据仪表盘体验', '数据仪表盘体验', 1, NULL, 2, 0, NULL, NULL, NULL, NULL); -INSERT INTO dss_homepage_demo_instance (id, menu_id, name, url, title_en, title_cn, description, is_active, icon, `order`, click_num, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, 3, '可视化挂件快速体验', 'https://github.com/WeBankFinTech/DataSphereStudio', 'visual widgets quick experience', '可视化挂件快速体验', '可视化挂件快速体验', 1, NULL, 3, 0, NULL, NULL, NULL, NULL); - -INSERT INTO dss_homepage_demo_menu (id, name, title_en, title_cn, description, is_active, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (1, 'workflow', 'workflow', '工作流', '工作流', 1, NULL, 1, NULL, NULL, NULL, NULL); -INSERT INTO dss_homepage_demo_menu (id, name, title_en, title_cn, description, is_active, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (2, 'application', 'application', '应用场景', '应用场景', 1, NULL, 2, NULL, NULL, NULL, NULL); -INSERT INTO dss_homepage_demo_menu (id, name, title_en, title_cn, description, is_active, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (3, 'visualization', 'visualization', '可视化', '可视化', 1, NULL, 3, NULL, NULL, NULL, NULL); - -INSERT INTO dss_homepage_video (id, name, url, title_en, title_cn, description, is_active, icon, `order`, play_num, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, '10秒教你搭建工作流', 'https://sandbox.webank.com/wds/dss/videos/1.mp4', '10 sec how to build workflow', 
'10秒教你搭建工作流', '10秒教你搭建工作流', 1, NULL, 1, 0, NULL, NULL, NULL, NULL); -INSERT INTO dss_homepage_video (id, name, url, title_en, title_cn, description, is_active, icon, `order`, play_num, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, '10秒教你发邮件', 'https://sandbox.webank.com/wds/dss/videos/1.mp4', '10 sec how to send email', '10秒教你发邮件', '10秒教你发邮件', 1, NULL, 2, 0, NULL, NULL, NULL, NULL); - -INSERT INTO dss_onestop_menu (id, name, title_en, title_cn, description, is_active, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (1, '应用开发', 'application development', '应用开发', '应用开发描述', 1, NULL, NULL, NULL, NULL, NULL, NULL); -INSERT INTO dss_onestop_menu (id, name, title_en, title_cn, description, is_active, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (2, '数据分析', 'data analysis', '数据分析', '数据分析描述', 1, NULL, NULL, NULL, NULL, NULL, NULL); -INSERT INTO dss_onestop_menu (id, name, title_en, title_cn, description, is_active, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (3, '生产运维', 'production operation', '生产运维', '生产运维描述', 1, NULL, NULL, NULL, NULL, NULL, NULL); -INSERT INTO dss_onestop_menu (id, name, title_en, title_cn, description, is_active, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (4, '数据质量', 'data quality', '数据质量', '数据质量描述', 1, NULL, NULL, NULL, NULL, NULL, NULL); -INSERT INTO dss_onestop_menu (id, name, title_en, title_cn, description, is_active, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (5, '管理员功能', 'administrator function', '管理员功能', '管理员功能描述', 0, NULL, NULL, NULL, NULL, NULL, NULL); - -INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, 
last_update_time, last_update_user) VALUES (NULL, NULL, 1, 'workflow development', '工作流开发', 'Workflow development is a data application development tool created by WeDataSphere with Linkis as the kernel.', '工作流开发是微众银行微数域(WeDataSphere)打造的数据应用开发工具,以任意桥(Linkis)做为内核,将满足从数据交换、脱敏清洗、分析挖掘、质量检测、可视化展现、定时调度到数据输出等数据应用开发全流程场景需求。', 'workflow, data warehouse development', '工作流,数仓开发', 1, 'enter workflow development', '进入工作流开发', 'user manual', '用户手册', 'https://github.com/WeBankFinTech/DataSphereStudio', 'fi-workflow|rgb(102, 102, 255)', NULL, NULL, NULL, NULL, NULL); -INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, NULL, 1, 'StreamSQL development', 'StreamSQL开发', 'Real-time application development is a streaming solution jointly built by WeDataSphere, Boss big data team and China Telecom ctcloud Big data team.', '实时应用开发是微众银行微数域(WeDataSphere)、Boss直聘大数据团队 和 中国电信天翼云大数据团队 社区联合共建的流式解决方案,以 Linkis 做为内核,基于 Flink Engine 构建的批流统一的 Flink SQL,助力实时化转型。', 'streaming, realtime', '流式,实时', 0, 'under union construction', '联合共建中', 'related information', '相关资讯', 'https://github.com/WeBankFinTech/DataSphereStudio', 'fi-scriptis|rgb(102, 102, 255)', NULL, NULL, NULL, NULL, NULL); -INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, NULL, 1, 'Data service development', '数据服务开发', 'Data service is a unified API service jointly built by WeDataSphere and Ihome Big data Team. 
With Linkis and DataSphere Studio as the kernel.', '数据服务是微众银行微数域(WeDataSphere)与 艾佳生活大数据团队 社区联合共建的统一API服务,以 Linkis 和 DataSphere Studio 做为内核,提供快速将 Scriptis 脚本生成数据API的能力,协助企业统一管理对内对外的API服务。', 'API, data service', 'API,数据服务', 0, 'under union construction', '联合共建中', 'related information', '相关资讯', 'https://github.com/WeBankFinTech/DataSphereStudio', 'fi-scriptis|rgb(102, 102, 255)', NULL, NULL, NULL, NULL, NULL); -INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, NULL, 2, 'Scriptis', 'Scriptis', 'Scriptis is a one-stop interactive data exploration analysis tool built by WeDataSphere, uses Linkis as the kernel.', 'Scriptis是微众银行微数域(WeDataSphere)打造的一站式交互式数据探索分析工具,以任意桥(Linkis)做为内核,提供多种计算存储引擎(如Spark、Hive、TiSpark等)、Hive数据库管理功能、资源(如Yarn资源、服务器资源)管理、应用管理和各种用户资源(如UDF、变量等)管理的能力。', 'scripts development,IDE', '脚本开发,IDE', 1, 'enter Scriptis', '进入Scriptis', 'user manual', '用户手册', 'https://github.com/WeBankFinTech/DataSphereStudio', 'fi-scriptis|rgb(102, 102, 255)', NULL, NULL, NULL, NULL, NULL); -INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, NULL, 2, 'Visualis', 'Visualis', 'Visualis is a data visualization BI tool based on Davinci, with Linkis as the kernel, it supports the analysis mode of data development exploration.', 'Visualis是基于宜信开源项目Davinci开发的数据可视化BI工具,以任意桥(Linkis)做为内核,支持拖拽式报表定义、图表联动、钻取、全局筛选、多维分析、实时查询等数据开发探索的分析模式,并做了水印、数据质量校验等金融级增强。', 'visualization, statement', '可视化,报表', 1, 'enter Visualis', '进入Visualis', 'user manual', 
'用户手册', 'https://github.com/WeBankFinTech/DataSphereStudio', 'fi-visualis|rgb(0, 153, 255)', NULL, NULL, NULL, NULL, NULL); -INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, NULL, 3, 'Schedulis', 'Schedulis', 'Description for Schedulis.', 'Schedulis描述', 'scheduling, workflow', '调度,工作流', 1, 'enter Schedulis', '进入Schedulis', 'user manual', '用户手册', 'https://github.com/WeBankFinTech/DataSphereStudio', 'fi-schedule|rgb(102, 102, 204)', NULL, NULL, NULL, NULL, NULL); -INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, NULL, 3, 'Application operation center', '应用运维中心', 'Description for Application operation center.', '应用运维中心描述', 'production, operation', '生产,运维', 0, 'enter application operation center', '进入应用运维中心', 'user manual', '用户手册', 'https://github.com/WeBankFinTech/DataSphereStudio', 'fi-scriptis|rgb(102, 102, 255)', NULL, NULL, NULL, NULL, NULL); -INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, NULL, 4, 'Qualitis', 'Qualitis', 'Qualitis is a financial and one-stop data quality management platform that provides data quality model definition, visualization and monitoring of data quality results', 
'Qualitis是一套金融级、一站式的数据质量管理平台,提供了数据质量模型定义,数据质量结果可视化、可监控等功能,并用一整套统一的流程来定义和检测数据集的质量并及时报告问题。', 'product, operations', '生产,运维', 1, 'enter Qualitis', '进入Qualitis', 'user manual', '用户手册', 'https://github.com/WeBankFinTech/DataSphereStudio', 'fi-qualitis|rgb(51, 153, 153)', NULL, NULL, NULL, NULL, NULL); -INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, NULL, 4, 'Exchangis', 'Exchangis', 'Exchangis is a lightweight, high scalability, data exchange platform, support for structured and unstructured data transmission between heterogeneous data sources.', 'Exchangis是一个轻量级的、高扩展性的数据交换平台,支持对结构化及无结构化的异构数据源之间的数据传输,在应用层上具有数据权限管控、节点服务高可用和多租户资源隔离等业务特性,而在数据层上又具有传输架构多样化、模块插件化和组件低耦合等架构特点。', 'user manual', '生产,运维', 1, 'enter Exchangis', '进入Exchangis', 'user manual', '用户手册', 'https://github.com/WeBankFinTech/DataSphereStudio', 'fi-exchange|(102, 102, 255)', NULL, NULL, NULL, NULL, NULL); -INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, NULL, 5, 'Workspace management', '工作空间管理', NULL, NULL, NULL, NULL, 1, 'workspace management', '工作空间管理', null, null, null, 'fi-scriptis|rgb(102, 102, 255)', null, null, null, null, null); -INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES 
(NULL, NULL, 5, 'User resources management', '用户资源管理', NULL, NULL, NULL, NULL, 1, 'user resource management', '用户资源管理', null, null, null, 'fi-scriptis|rgb(102, 102, 255)', null, null, null, null, null); +DELETE FROM dss_workspace; +insert into `dss_workspace` (`name`, `label`, `description`, `create_by`, `create_time`, `department`, `product`, `source`, `last_update_time`, `last_update_user`) values('bdapWorkspace','','bdapWorkspace','hadoop','2020-07-13 02:39:41','企业直通银行部','bdapWorkspace',NULL,'2020-07-13 02:39:41','hadoop'); +DELETE FROM dss_dictionary; +insert into `dss_dictionary`(`id`,`workspace_id`,`parent_key`,`dic_name`,`dic_name_en`,`dic_key`,`dic_value`,`dic_value_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (1,0,'0','空间开发流程','Space development process','w_develop_process',NULL,NULL,NULL,NULL,NULL,0,NULL,1,'空间开发流程','SYSTEM','2020-12-28 17:32:34',NULL,'2021-02-22 17:46:40'); +insert into `dss_dictionary`(`id`,`workspace_id`,`parent_key`,`dic_name`,`dic_name_en`,`dic_key`,`dic_value`,`dic_value_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (2,0,'w_develop_process','需求','Demand','wdp_demand','创建新的业务需求,并将需求指派给对应负责人。','Create new business requirements and assign them to the corresponding responsible person.','Demo案例','Demo case',NULL,0,'xuqiu',1,'空间开发流程-需求','SYSTEM','2020-12-28 17:32:35',NULL,'2021-02-23 09:38:07'); +insert into `dss_dictionary`(`id`,`workspace_id`,`parent_key`,`dic_name`,`dic_name_en`,`dic_key`,`dic_value`,`dic_value_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (3,0,'w_develop_process','设计','Design','wdp_design','针对新的业务需求,进行数仓规划和库表设计。','According to the new business requirements, data warehouse planning and database table design are carried out.','Demo案例','Demo 
case',NULL,0,'sheji',1,'空间开发流程-设计','SYSTEM','2020-12-28 17:32:35',NULL,'2021-02-23 09:38:09'); +insert into `dss_dictionary`(`id`,`workspace_id`,`parent_key`,`dic_name`,`dic_name_en`,`dic_key`,`dic_value`,`dic_value_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (4,0,'w_develop_process','开发','Development','wdp_development','针对新的业务需求,进行数仓规划和库表设计。','According to the new business requirements, data warehouse planning and database table design are carried out.','Demo案例','Demo case',NULL,0,'kaifa',1,'空间开发流程-开发','SYSTEM','2020-12-28 17:32:35',NULL,'2021-02-23 09:38:10'); +insert into `dss_dictionary`(`id`,`workspace_id`,`parent_key`,`dic_name`,`dic_name_en`,`dic_key`,`dic_value`,`dic_value_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (5,0,'w_develop_process','调试','Debugging','wdp_debug','创建新的业务需求,并将需求指派给对应负责人。','Create new business requirements and assign them to the corresponding responsible person.','Demo案例','Demo case',NULL,0,'tiaoshi',1,'空间开发流程-调试','SYSTEM','2020-12-28 17:32:35',NULL,'2021-02-23 09:38:11'); +insert into `dss_dictionary`(`id`,`workspace_id`,`parent_key`,`dic_name`,`dic_name_en`,`dic_key`,`dic_value`,`dic_value_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (6,0,'w_develop_process','生产','Production','wdp_product','创建新的业务需求,并将需求指派给对应负责人。','Create new business requirements and assign them to the corresponding responsible person.','Demo案例','Demo case',NULL,0,'shengchan',1,'空间开发流程-生产','SYSTEM','2020-12-28 17:32:35',NULL,'2021-02-23 09:38:12'); +insert into `dss_dictionary`(`id`,`workspace_id`,`parent_key`,`dic_name`,`dic_name_en`,`dic_key`,`dic_value`,`dic_value_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values 
(7,0,'0','工程开发流程','Engineering development process','p_develop_process',NULL,NULL,NULL,NULL,NULL,0,NULL,1,'工程开发流程','SYSTEM','2020-12-28 17:32:35',NULL,'2021-02-22 17:48:48'); +insert into `dss_dictionary`(`id`,`workspace_id`,`parent_key`,`dic_name`,`dic_name_en`,`dic_key`,`dic_value`,`dic_value_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (8,0,'p_develop_process','开发中心','Development Center','pdp_development_center','dev',NULL,NULL,NULL,NULL,0,'kaifa-icon',1,'工程开发流程-开发中心','SYSTEM','2020-12-28 17:32:35',NULL,'2021-02-22 17:49:02'); +insert into `dss_dictionary`(`id`,`workspace_id`,`parent_key`,`dic_name`,`dic_name_en`,`dic_key`,`dic_value`,`dic_value_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (10,0,'0','工程编排模式','Project layout mode','p_orchestrator_mode',NULL,NULL,NULL,NULL,NULL,0,NULL,1,'工程编排模式','SYSTEM','2020-12-28 17:32:35',NULL,'2021-02-22 17:49:36'); +insert into `dss_dictionary`(`id`,`workspace_id`,`parent_key`,`dic_name`,`dic_name_en`,`dic_key`,`dic_value`,`dic_value_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (11,0,'p_orchestrator_mode','工作流','Workflow','pom_work_flow','radio',NULL,NULL,NULL,NULL,0,'gongzuoliu-icon',1,'工程编排模式-工作流','SYSTEM','2020-12-28 17:32:35',NULL,'2021-02-22 17:49:49'); +insert into `dss_dictionary`(`id`,`workspace_id`,`parent_key`,`dic_name`,`dic_name_en`,`dic_key`,`dic_value`,`dic_value_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (14,0,'pom_work_flow','DAG','DAG','pom_work_flow_DAG',NULL,NULL,NULL,NULL,NULL,0,NULL,1,'工程编排模式-工作流-DAG','SYSTEM','2020-12-28 17:32:35',NULL,'2021-02-22 17:50:31'); +insert into 
`dss_dictionary`(`id`,`workspace_id`,`parent_key`,`dic_name`,`dic_name_en`,`dic_key`,`dic_value`,`dic_value_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (16,0,'pom_single_task','Scriptis','Scriptis','pom_single_task_scriptis',NULL,NULL,NULL,NULL,NULL,0,NULL,1,'工程编排模式-单任务-Scriptis','SYSTEM','2020-12-28 17:32:35',NULL,'2021-02-22 17:51:08'); +insert into `dss_dictionary`(`id`,`workspace_id`,`parent_key`,`dic_name`,`dic_name_en`,`dic_key`,`dic_value`,`dic_value_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (18,0,'pom_single_task','Qualitis','Qualitis','pom_single_task_qualitis',NULL,NULL,NULL,NULL,NULL,0,NULL,1,'工程编排模式-单任务-Qualitis','SYSTEM','2020-12-28 17:32:35',NULL,'2021-02-22 17:50:53'); +insert into `dss_dictionary`(`id`,`workspace_id`,`parent_key`,`dic_name`,`dic_name_en`,`dic_key`,`dic_value`,`dic_value_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (20,0,'pom_consist_orchestrator','Qualitis','Qualitis','pom_consist_orchestrator_qualitis',NULL,NULL,NULL,NULL,NULL,0,NULL,1,'工程编排模式-组合编排-Qualitis','SYSTEM','2020-12-28 17:32:35',NULL,'2021-02-22 17:50:57'); +insert into `dss_dictionary`(`id`,`workspace_id`,`parent_key`,`dic_name`,`dic_name_en`,`dic_key`,`dic_value`,`dic_value_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (21,0,'pom_consist_orchestrator','Email','Email','pom_consist_orchestrator_email',NULL,NULL,NULL,NULL,NULL,0,NULL,1,'工程编排模式-组合编排-Email','SYSTEM','2020-12-28 17:32:35',NULL,'2021-02-22 17:51:22'); +insert into `dss_dictionary` (`workspace_id`, `parent_key`, `dic_name`, `dic_name_en`, `dic_key`, `dic_value`, `dic_value_en`, `title`, `title_en`, `url`, `url_type`, `icon`, `order_num`, `remark`, `create_user`, 
`create_time`, `update_user`, `update_time`) values('0','0','工作空间默认部门','Workspace default department','w_workspace_department','10001-部门一;10002-部门二;10003-部门三',NULL,NULL,NULL,NULL,'0',NULL,'1','工作空间默认部门,前面是id后面是部门名称中间使用‘-’,横杆分隔,多个以英文分号分隔','SYSTEM','2020-12-28 17:32:34',NULL,'2021-02-22 17:46:40'); + +DELETE FROM dss_menu; +insert into `dss_menu`(`id`,`name`,`level`,`upper_menu_id`,`front_name`,`comment`,`description`,`is_active`,`is_component`,`icon`,`application_id`) values (2,'develop_center','1',0,'开发中心',NULL,NULL,1,0,'icon-kaifazhongxinmorenzhuangtai',0); +insert into `dss_menu`(`id`,`name`,`level`,`upper_menu_id`,`front_name`,`comment`,`description`,`is_active`,`is_component`,`icon`,`application_id`) values (3,'analysis_center','1',0,'分析中心',NULL,NULL,1,0,'icon-fenxizhongxin',0); +insert into `dss_menu`(`id`,`name`,`level`,`upper_menu_id`,`front_name`,`comment`,`description`,`is_active`,`is_component`,`icon`,`application_id`) values (8,'schedule_center','2',1,'调度中心',NULL,NULL,1,0,NULL,0); +insert into `dss_menu`(`id`,`name`,`level`,`upper_menu_id`,`front_name`,`comment`,`description`,`is_active`,`is_component`,`icon`,`application_id`) values (12,'workflow','2',2,'工作流开发',NULL,NULL,1,1,NULL,2); +insert into `dss_menu`(`id`,`name`,`level`,`upper_menu_id`,`front_name`,`comment`,`description`,`is_active`,`is_component`,`icon`,`application_id`) values (16,'scriptis','2',3,'意书(Scriptis)',NULL,NULL,1,1,NULL,1); +insert into `dss_menu`(`id`,`name`,`level`,`upper_menu_id`,`front_name`,`comment`,`description`,`is_active`,`is_component`,`icon`,`application_id`) values (28,'settings','1',0,'设置',NULL,NULL,1,0,'icon-shezhi',0); +insert into `dss_menu`(`id`,`name`,`level`,`upper_menu_id`,`front_name`,`comment`,`description`,`is_active`,`is_component`,`icon`,`application_id`) values (29,'workspace_setting','2',28,'工作空间设置',NULL,NULL,1,0,NULL,0); +insert into
`dss_menu`(`id`,`name`,`level`,`upper_menu_id`,`front_name`,`comment`,`description`,`is_active`,`is_component`,`icon`,`application_id`) values (30,'user_manage','2',28,'用户管理',NULL,NULL,1,0,NULL,0); +insert into `dss_menu`(`id`,`name`,`level`,`upper_menu_id`,`front_name`,`comment`,`description`,`is_active`,`is_component`,`icon`,`application_id`) values (31,'priv_manage','2',28,'权限管理',NULL,NULL,1,0,NULL,0); +insert into `dss_menu`(`id`,`name`,`level`,`upper_menu_id`,`front_name`,`comment`,`description`,`is_active`,`is_component`,`icon`,`application_id`) values (32,'apiService','2',2,'数据服务',NULL,NULL,1,1,NULL,11); + +DELETE FROM dss_sidebar; +insert into `dss_sidebar`(`id`,`workspace_id`,`name`,`name_en`,`title`,`title_en`,`type`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (1,0,'知识库','Knowledge base','知识库','Knowledge base',0,1,NULL,'SYSTEM','2020-12-15 13:21:06',NULL,'2021-02-23 09:45:41'); +insert into
`dss_sidebar`(`id`,`workspace_id`,`name`,`name_en`,`title`,`title_en`,`type`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (2,0,'菜单','Menu','菜单','Menu',1,1,NULL,'SYSTEM','2020-12-15 13:21:06',NULL,'2021-02-23 09:45:50'); +insert into `dss_sidebar`(`id`,`workspace_id`,`name`,`name_en`,`title`,`title_en`,`type`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (3,0,'常见问题','Common problem','常见问题','Common problem',1,1,NULL,'SYSTEM','2020-12-15 13:21:06',NULL,'2021-02-23 09:46:18'); + +DELETE FROM dss_sidebar_content; +insert into `dss_sidebar_content`(`id`,`workspace_id`,`sidebar_id`,`name`,`name_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (1,0,1,NULL,NULL,'部门队列申请','Department application','http://127.0.0.1:8088/kn/d/38',1,'wendang',1,NULL,'SYSTEM','2020-12-15 13:21:06',NULL,'2021-02-23 09:47:29'); +insert into `dss_sidebar_content`(`id`,`workspace_id`,`sidebar_id`,`name`,`name_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (2,0,1,NULL,NULL,'部门导数申请','Departmental derivative','http://127.0.0.1:8088/kn/d/40',1,'wendang',1,NULL,'SYSTEM','2020-12-15 13:21:07',NULL,'2021-02-23 09:47:36'); +insert into `dss_sidebar_content`(`id`,`workspace_id`,`sidebar_id`,`name`,`name_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (3,0,2,NULL,NULL,'工作空间管理','Workspace management','/workspaceManagement/productsettings',0,'menuIcon',1,NULL,'SYSTEM','2020-12-15 13:21:07',NULL,'2021-02-23 09:47:49'); +insert into `dss_sidebar_content`(`id`,`workspace_id`,`sidebar_id`,`name`,`name_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (4,0,2,NULL,NULL,'知识库','Knowledge 
base','http://127.0.0.1:8088/kn/d/40',1,'menuIcon',1,NULL,'SYSTEM','2020-12-15 13:21:07',NULL,'2021-02-23 09:47:11'); +insert into `dss_sidebar_content`(`id`,`workspace_id`,`sidebar_id`,`name`,`name_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (5,0,3,NULL,NULL,'资源配置说明',NULL,'http://127.0.0.1:8088/kn/d/38',1,'fi-warn',1,NULL,'SYSTEM','2020-12-15 13:21:07',NULL,'2021-01-12 17:16:52'); +insert into `dss_sidebar_content`(`id`,`workspace_id`,`sidebar_id`,`name`,`name_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (6,0,3,NULL,NULL,'Spark使用指南','Spark user guide','http://127.0.0.1:8088/kn/d/40',1,'fi-warn',1,NULL,'SYSTEM','2020-12-15 13:21:07',NULL,'2021-02-23 09:48:28'); +insert into `dss_sidebar_content`(`id`,`workspace_id`,`sidebar_id`,`name`,`name_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (7,0,3,NULL,NULL,'Hive语法介绍',NULL,'http://127.0.0.1:8088/kn/d/34',1,'fi-warn',1,NULL,'SYSTEM','2020-12-15 13:21:07',NULL,'2021-01-12 17:17:00'); +insert into `dss_sidebar_content`(`id`,`workspace_id`,`sidebar_id`,`name`,`name_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (8,0,3,NULL,NULL,'工作流使用介绍',NULL,'http://127.0.0.1:8088/kn/d/42',1,'fi-warn',1,NULL,'SYSTEM','2020-12-15 13:21:07',NULL,'2021-01-12 17:17:01'); +insert into `dss_sidebar_content`(`id`,`workspace_id`,`sidebar_id`,`name`,`name_en`,`title`,`title_en`,`url`,`url_type`,`icon`,`order_num`,`remark`,`create_user`,`create_time`,`update_user`,`update_time`) values (9,0,3,NULL,NULL,'数据服务使用介绍','Introduction to data service','http://127.0.0.1:8088/kn/d/32',1,'fi-warn',1,NULL,'SYSTEM','2020-12-15 13:21:07',NULL,'2021-02-23 09:48:19'); + +DELETE FROM dss_onestop_menu; +INSERT
INTO `dss_onestop_menu` (`id`, `name`, `title_en`, `title_cn`, `description`, `is_active`, `icon`, `order`, `create_by`, `create_time`, `last_update_time`, `last_update_user`) VALUES('1','应用开发','application development','应用开发','应用开发描述','1',NULL,NULL,NULL,NULL,NULL,NULL); +INSERT INTO `dss_onestop_menu` (`id`, `name`, `title_en`, `title_cn`, `description`, `is_active`, `icon`, `order`, `create_by`, `create_time`, `last_update_time`, `last_update_user`) VALUES('2','数据分析','data analysis','数据分析','数据分析描述','1',NULL,NULL,NULL,NULL,NULL,NULL); +INSERT INTO `dss_onestop_menu` (`id`, `name`, `title_en`, `title_cn`, `description`, `is_active`, `icon`, `order`, `create_by`, `create_time`, `last_update_time`, `last_update_user`) VALUES('3','生产运维','production operation','生产运维','生产运维描述','1',NULL,NULL,NULL,NULL,NULL,NULL); +INSERT INTO `dss_onestop_menu` (`id`, `name`, `title_en`, `title_cn`, `description`, `is_active`, `icon`, `order`, `create_by`, `create_time`, `last_update_time`, `last_update_user`) VALUES('4','数据质量','data quality','数据质量','数据质量描述','1',NULL,NULL,NULL,NULL,NULL,NULL); +INSERT INTO `dss_onestop_menu` (`id`, `name`, `title_en`, `title_cn`, `description`, `is_active`, `icon`, `order`, `create_by`, `create_time`, `last_update_time`, `last_update_user`) VALUES('5','管理员功能','administrator function','管理员功能','管理员功能描述','0',NULL,NULL,NULL,NULL,NULL,NULL); + +DELETE FROM dss_onestop_menu_application; +INSERT INTO `dss_onestop_menu_application` (`id`, `application_id`, `onestop_menu_id`, `title_en`, `title_cn`, `desc_en`, `desc_cn`, `labels_en`, `labels_cn`, `is_active`, `access_button_en`, `access_button_cn`, `manual_button_en`, `manual_button_cn`, `manual_button_url`, `icon`, `order`, `create_by`, `create_time`, `last_update_time`, `last_update_user`, `image`) VALUES('2',NULL,'1','StreamSQL development','StreamSQL开发','Real-time application development is a streaming solution jointly built by WeDataSphere, Boss big data team and China Telecom ctcloud Big data 
team.','实时应用开发是微众银行微数域(WeDataSphere)、Boss直聘大数据团队 和 中国电信天翼云大数据团队 社区联合共建的流式解决方案,以 Linkis 做为内核,基于 Flink Engine 构建的批流统一的 Flink SQL,助力实时化转型。','streaming, realtime','流式,实时','0','under union construction','联合共建中','related information','相关资讯','http://127.0.0.1:8088/wiki/scriptis/manual/workspace_cn.html','shujukaifa-logo',NULL,NULL,NULL,NULL,NULL,'shujukaifa-icon'); +INSERT INTO `dss_onestop_menu_application` (`id`, `application_id`, `onestop_menu_id`, `title_en`, `title_cn`, `desc_en`, `desc_cn`, `labels_en`, `labels_cn`, `is_active`, `access_button_en`, `access_button_cn`, `manual_button_en`, `manual_button_cn`, `manual_button_url`, `icon`, `order`, `create_by`, `create_time`, `last_update_time`, `last_update_user`, `image`) VALUES('3','11','1','Data service development','数据服务开发','Data service is a unified API service jointly built by WeDataSphere and Ihome Big data Team, with Linkis and DataSphere Studio as the kernel.','数据服务是微众银行微数域(WeDataSphere)与 艾佳生活大数据团队 社区联合共建的统一API服务,以 Linkis 和 DataSphere Studio 做为内核,提供快速将 Scriptis 脚本生成数据API的能力,协助企业统一管理对内对外的API服务。','API, data service','API,数据服务','1','enter data service','进入数据服务','related information','相关资讯','http://127.0.0.1:8088/wiki/scriptis/manual/workspace_cn.html','shujufuwu-logo',NULL,NULL,NULL,NULL,NULL,'shujufuwu-icon'); +INSERT INTO `dss_onestop_menu_application` (`id`, `application_id`, `onestop_menu_id`, `title_en`, `title_cn`, `desc_en`, `desc_cn`, `labels_en`, `labels_cn`, `is_active`, `access_button_en`, `access_button_cn`, `manual_button_en`, `manual_button_cn`, `manual_button_url`, `icon`, `order`, `create_by`, `create_time`, `last_update_time`, `last_update_user`, `image`) VALUES('4','1','2','Scriptis','Scriptis','Scriptis is a one-stop interactive data exploration analysis tool built by WeDataSphere, uses Linkis as the kernel.','Scriptis是微众银行微数域(WeDataSphere)打造的一站式交互式数据探索分析工具,以任意桥(Linkis)做为内核,提供多种计算存储引擎(如Spark、Hive、TiSpark等)、Hive数据库管理功能、资源(如Yarn资源、服务器资源)管理、应用管理和各种用户资源(如UDF、变量等)管理的能力。','scripts
development,IDE','脚本开发,IDE','1','enter Scriptis','进入Scriptis','user manual','用户手册','http://127.0.0.1:8088/wiki/scriptis/manual/workspace_cn.html','shujukaifa-logo',NULL,NULL,NULL,NULL,NULL,'shujukaifa-icon'); + +INSERT INTO `dss_onestop_menu_application` (`id`, `application_id`, `onestop_menu_id`, `title_en`, `title_cn`, `desc_en`, `desc_cn`, `labels_en`, `labels_cn`, `is_active`, `access_button_en`, `access_button_cn`, `manual_button_en`, `manual_button_cn`, `manual_button_url`, `icon`, `order`, `create_by`, `create_time`, `last_update_time`, `last_update_user`, `image`) VALUES('9',NULL,'4','Exchangis','Exchangis','Exchangis is a lightweight, highly scalable data exchange platform that supports structured and unstructured data transmission between heterogeneous data sources.','Exchangis是一个轻量级的、高扩展性的数据交换平台,支持对结构化及无结构化的异构数据源之间的数据传输,在应用层上具有数据权限管控、节点服务高可用和多租户资源隔离等业务特性,而在数据层上又具有传输架构多样化、模块插件化和组件低耦合等架构特点。','production, operation','生产,运维','0','enter Exchangis','进入Exchangis','user manual','用户手册','http://127.0.0.1:8088/wiki/scriptis/manual/workspace_cn.html','shujujiaohuan-logo',NULL,NULL,NULL,NULL,NULL,'shujujiaohuan-icon'); +INSERT INTO `dss_onestop_menu_application` (`id`, `application_id`, `onestop_menu_id`, `title_en`, `title_cn`, `desc_en`, `desc_cn`, `labels_en`, `labels_cn`, `is_active`, `access_button_en`, `access_button_cn`, `manual_button_en`, `manual_button_cn`, `manual_button_url`, `icon`, `order`, `create_by`, `create_time`, `last_update_time`, `last_update_user`, `image`) VALUES('10','7','5','Workspace management','工作空间管理',NULL,NULL,NULL,NULL,'1','workspace management','工作空间管理',NULL,NULL,NULL,'shujukaifa-logo',NULL,NULL,NULL,NULL,NULL,'shujukaifa-icon'); +INSERT INTO `dss_onestop_menu_application` (`id`, `application_id`, `onestop_menu_id`, `title_en`, `title_cn`, `desc_en`, `desc_cn`, `labels_en`, `labels_cn`, `is_active`, `access_button_en`, `access_button_cn`, `manual_button_en`, `manual_button_cn`, `manual_button_url`, `icon`, `order`, `create_by`, `create_time`,
`last_update_time`, `last_update_user`, `image`) VALUES('11',NULL,'5','User resources management','用户资源管理',NULL,NULL,NULL,NULL,'1','user resource management','用户资源管理',NULL,NULL,NULL,'shujukaifa-logo',NULL,NULL,NULL,NULL,NULL,'shujukaifa-icon'); + +DELETE FROM dss_role; +DELETE FROM dss_workspace_user_role; +INSERT INTO `dss_role` (`id`, `workspace_id`, `name`, `front_name`, `update_time`, `description`) VALUES('1','-1','admin','管理员','2020-07-13 02:43:35','通用角色管理员'); +INSERT INTO `dss_role` (`id`, `workspace_id`, `name`, `front_name`, `update_time`, `description`) VALUES('2','-1','maintenance','运维用户','2020-07-13 02:43:35','通用角色运维用户'); +INSERT INTO `dss_role` (`id`, `workspace_id`, `name`, `front_name`, `update_time`, `description`) VALUES('3','-1','developer','开发用户','2020-07-13 02:43:35','通用角色开发用户'); +INSERT INTO `dss_role` (`id`, `workspace_id`, `name`, `front_name`, `update_time`, `description`) VALUES('4','-1','analyser','分析用户','2020-07-13 02:43:36','通用角色分析用户'); +INSERT INTO `dss_role` (`id`, `workspace_id`, `name`, `front_name`, `update_time`, `description`) VALUES('5','-1','operator','运营用户','2020-07-13 02:43:36','通用角色运营用户'); +INSERT INTO `dss_role` (`id`, `workspace_id`, `name`, `front_name`, `update_time`, `description`) VALUES('6','-1','boss','领导','2020-07-13 02:43:36','通用角色领导'); +INSERT INTO `dss_role` (`id`, `workspace_id`, `name`, `front_name`, `update_time`, `description`) VALUES('7','-1','apiUser','数据服务用户','2020-08-21 11:35:02','通用角色数据服务用户'); + +DELETE FROM dss_application; +INSERT INTO `dss_application`(`id`,`name`,`url`,`is_user_need_init`,`level`,`user_init_url`,`exists_project_service`,`project_url`,`enhance_json`,`if_iframe`,`homepage_url`,`redirect_url`) VALUES (1,'linkis','http://127.0.0.1:9001',0,1,NULL,0,'/home','{\"watermark\":false,\"rsDownload\":true}',0,'/home',NULL); +INSERT INTO 
`dss_application`(`id`,`name`,`url`,`is_user_need_init`,`level`,`user_init_url`,`exists_project_service`,`project_url`,`enhance_json`,`if_iframe`,`homepage_url`,`redirect_url`) VALUES (4,'workflow','http://127.0.0.1:9001',0,1,NULL,0,'/workflow',NULL,0,'/project',NULL); +INSERT INTO `dss_application`(`id`,`name`,`url`,`is_user_need_init`,`level`,`user_init_url`,`exists_project_service`,`project_url`,`enhance_json`,`if_iframe`,`homepage_url`,`redirect_url`) VALUES (5,'console',NULL,0,1,NULL,0,'/console',NULL,0,'/console',NULL); +INSERT INTO `dss_application`(`id`,`name`,`url`,`is_user_need_init`,`level`,`user_init_url`,`exists_project_service`,`project_url`,`enhance_json`,`if_iframe`,`homepage_url`,`redirect_url`) VALUES (7,'workspace management','/workspaceManagement',0,1,NULL,0,NULL,NULL,NULL,'/workspaceManagement',NULL); +INSERT INTO `dss_application`(`id`,`name`,`url`,`is_user_need_init`,`level`,`user_init_url`,`exists_project_service`,`project_url`,`enhance_json`,`if_iframe`,`homepage_url`,`redirect_url`) VALUES (11,'apiService','http://127.0.0.1:9001',0,1,NULL,0,'/apiservices',NULL,0,'/apiservices',NULL); + +UPDATE `dss_application` SET url = 'http://GATEWAY_INSTALL_IP:GATEWAY_PORT' WHERE `name` in('linkis','workflow'); + +DELETE FROM dss_project_taxonomy; +INSERT INTO `dss_project_taxonomy` (`id`, `name`, `description`, `creator`, `create_time`, `update_time`) VALUES (NULL, 'My project', NULL, '-1', NULL, NULL); + +DELETE FROM dss_workflow_node; +insert into `dss_workflow_node` (`id`, `name`, `appconn_name`, `node_type`, `jump_url`, `support_jump`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `icon`) values('1','python','-1','linkis.python.python',NULL,'1','1','1','0','pythonCreated with Sketch.'); +insert into `dss_workflow_node` (`id`, `name`, `appconn_name`, `node_type`, `jump_url`, `support_jump`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `icon`) 
values('2','pyspark','-1','linkis.spark.py',NULL,'1','1','1','0','pysparkCreated with Sketch.'); +insert into `dss_workflow_node` (`id`, `name`, `appconn_name`, `node_type`, `jump_url`, `support_jump`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `icon`) values('3','sql','-1','linkis.spark.sql',NULL,'1','1','1','0','sqlCreated with Sketch.'); +insert into `dss_workflow_node` (`id`, `name`, `appconn_name`, `node_type`, `jump_url`, `support_jump`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `icon`) values('4','scala','-1','linkis.spark.scala',NULL,'1','1','1','0','ScalaCreated with Sketch.'); +insert into `dss_workflow_node` (`id`, `name`, `appconn_name`, `node_type`, `jump_url`, `support_jump`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `icon`) values('5','hql','-1','linkis.hive.hql',NULL,'1','1','1','0',' hqlCreated with Sketch.'); +insert into `dss_workflow_node` (`id`, `name`, `appconn_name`, `node_type`, `jump_url`, `support_jump`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `icon`) values('6','jdbc','-1','linkis.jdbc.jdbc',NULL,'1','1','1','0',''); +insert into `dss_workflow_node` (`id`, `name`, `appconn_name`, `node_type`, `jump_url`, `support_jump`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `icon`) values('7','connector','-1','linkis.control.empty',NULL,'0','1','1','0','connectorCreated with Sketch.'); +insert into `dss_workflow_node` (`id`, `name`, `appconn_name`, `node_type`, `jump_url`, `support_jump`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `icon`) values('8','sendemail','-1','linkis.appconn.sendemail',NULL,'0','1','1','0','sendemailCreated with Sketch.'); +insert into `dss_workflow_node` (`id`, `name`, `appconn_name`, `node_type`, `jump_url`, `support_jump`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `icon`) 
values('9','eventsender','-1','linkis.appconn.eventchecker.eventsender',NULL,'0','1','1','0','eventsenderCreated with Sketch.'); +insert into `dss_workflow_node` (`id`, `name`, `appconn_name`, `node_type`, `jump_url`, `support_jump`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `icon`) values('10','eventreceiver','-1','linkis.appconn.eventchecker.eventreceiver',NULL,'0','1','1','0','eventcheckerCreated with Sketch.'); +insert into `dss_workflow_node` (`id`, `name`, `appconn_name`, `node_type`, `jump_url`, `support_jump`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `icon`) values('11','datachecker','-1','linkis.appconn.datachecker',NULL,'0','1','1','0','datacheckerCreated with Sketch.'); +insert into `dss_workflow_node` (`id`, `name`, `appconn_name`, `node_type`, `jump_url`, `support_jump`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `icon`) values('12','subFlow','-1','workflow.subflow',NULL,'1','1','0','1','subflowCreated with Sketch.'); +insert into `dss_workflow_node` (`id`, `name`, `appconn_name`, `node_type`, `jump_url`, `support_jump`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `icon`) values('15','shell','-1','linkis.shell.sh',NULL,'1','1','1','0','shellCreated with Sketch.'); + +DELETE FROM dss_workflow_node_group; +insert into `dss_workflow_node_group`(`id`,`name`,`name_en`,`description`,`order`) values (1,'数据交换','Data exchange',NULL,1); +insert into `dss_workflow_node_group`(`id`,`name`,`name_en`,`description`,`order`) values (2,'数据开发','Data development',NULL,2); +insert into `dss_workflow_node_group`(`id`,`name`,`name_en`,`description`,`order`) values (3,'数据质量','Data quality',NULL,3); +insert into `dss_workflow_node_group`(`id`,`name`,`name_en`,`description`,`order`) values (4,'数据可视化','Data visualization',NULL,4); +insert into `dss_workflow_node_group`(`id`,`name`,`name_en`,`description`,`order`) values (5,'数据输出','Data output',NULL,8); +insert into
`dss_workflow_node_group`(`id`,`name`,`name_en`,`description`,`order`) values (6,'信号节点','Signal node',NULL,6); +insert into `dss_workflow_node_group`(`id`,`name`,`name_en`,`description`,`order`) values (7,'功能节点','Function node',NULL,7); +insert into `dss_workflow_node_group`(`id`,`name`,`name_en`,`description`,`order`) values (8,'机器学习','Machine Learning',NULL,5); + +DELETE FROM dss_workflow_node_to_group; +insert into `dss_workflow_node_to_group`(`node_id`,`group_id`) values (1,2); +insert into `dss_workflow_node_to_group`(`node_id`,`group_id`) values (2,2); +insert into `dss_workflow_node_to_group`(`node_id`,`group_id`) values (3,2); +insert into `dss_workflow_node_to_group`(`node_id`,`group_id`) values (4,2); +insert into `dss_workflow_node_to_group`(`node_id`,`group_id`) values (5,2); +insert into `dss_workflow_node_to_group`(`node_id`,`group_id`) values (6,2); +insert into `dss_workflow_node_to_group`(`node_id`,`group_id`) values (15,2); +insert into `dss_workflow_node_to_group`(`node_id`,`group_id`) values (16,3); +insert into `dss_workflow_node_to_group`(`node_id`,`group_id`) values (8,5); +insert into `dss_workflow_node_to_group`(`node_id`,`group_id`) values (9,6); +insert into `dss_workflow_node_to_group`(`node_id`,`group_id`) values (10,6); +insert into `dss_workflow_node_to_group`(`node_id`,`group_id`) values (11,6); +insert into `dss_workflow_node_to_group`(`node_id`,`group_id`) values (12,7); +insert into `dss_workflow_node_to_group`(`node_id`,`group_id`) values (7,7); +insert into `dss_workflow_node_to_group`(`node_id`,`group_id`) values (18,3); +insert into `dss_workflow_node_to_group`(`node_id`,`group_id`) values (19,3); +insert into `dss_workflow_node_to_group`(`node_id`,`group_id`) values (20,8); + +DELETE FROM dss_workflow_node_to_ui; +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (3,1); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (3,3); +insert into 
`dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (3,5); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (3,6); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (3,7); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (3,8); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (3,9); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (3,10); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (3,11); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (3,12); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (1,1); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (1,3); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (1,5); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (1,6); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (2,1); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (2,3); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (2,5); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (2,6); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (2,7); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (2,8); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (2,9); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (2,10); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (2,11); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (2,12); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (4,1); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (4,3); +insert into 
`dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (4,5); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (4,6); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (4,7); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (4,8); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (4,9); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (4,10); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (4,11); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (5,1); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (5,3); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (5,5); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (5,6); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (5,11); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (7,1); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (7,3); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (7,5); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (7,6); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (8,1); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (8,3); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (8,5); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (8,6); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (8,13); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (8,14); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (8,15); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (8,16); +insert into 
`dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (8,17); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (8,18); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (9,1); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (9,3); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (9,5); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (9,6); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (9,20); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (10,39); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (9,22); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (10,40); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (9,24); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (10,1); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (10,3); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (10,5); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (10,6); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (9,21); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (9,23); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (10,25); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (10,26); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (10,27); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (10,28); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (10,29); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (10,30); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (11,1); +insert into 
`dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (11,3); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (11,5); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (11,6); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (11,31); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (11,32); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (11,33); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (11,34); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (12,1); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (12,2); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (12,3); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (12,4); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (12,5); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (12,6); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (13,1); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (13,2); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (13,3); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (13,4); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (13,5); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (13,6); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (14,1); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (14,2); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (14,3); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (14,4); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (14,5); +insert into 
`dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (14,6); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (15,1); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (15,3); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (15,5); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (15,6); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (16,1); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (16,3); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (16,5); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (16,6); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (16,35); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (16,36); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (17,1); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (17,3); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (17,5); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (17,6); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (17,37); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (17,2); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (17,4); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (18,1); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (18,3); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (18,41); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (18,42); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (20,43); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (20,44); +insert into 
`dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (20,1); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (20,3); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (20,2); +insert into `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (20,4); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) VALUES (1,45); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) VALUES (2,45); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) VALUES (3,45); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) VALUES (4,45); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) VALUES (5,45); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) VALUES (6,45); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) VALUES (7,45); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) VALUES (8,45); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) VALUES (9,45); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) VALUES (10,45); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) VALUES (11,45); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) VALUES (12,45); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) VALUES (13,45); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) VALUES (14,45); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) VALUES (15,45); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) VALUES (17,45); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) VALUES (20,45); + +DELETE FROM dss_workflow_node_ui; +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) 
values (1,'title','请填写节点名称','Please enter node name','节点名','Node name','Input',1,NULL,NULL,0,NULL,0,1,1,1,'node'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (2,'title','请填写节点名称','Please enter node name','节点名','Node name','Input',1,NULL,NULL,0,NULL,0,1,0,1,'node'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (3,'desc','请填写节点描述','Please enter the node description','节点描述','Node description','Text',0,NULL,NULL,0,NULL,0,4,1,1,'node'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (4,'desc','请填写节点描述','Please enter the node description','节点描述','Node description','Text',0,NULL,NULL,0,NULL,0,2,0,1,'node'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (5,'businessTag',NULL,NULL,'业务标签','businessTag','Tag',0,NULL,NULL,0,NULL,0,2,1,1,'node'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (6,'appTag',NULL,NULL,'应用标签','appTag','Tag',0,NULL,NULL,0,NULL,0,3,1,1,'node'); +insert into 
`dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (7,'spark.driver.memory','驱动器内存大小,默认值:2','Driver memory, default value: 2','spark-driver-memory','spark-driver-memory','Input',0,NULL,'2',0,NULL,0,1,1,0,'startup'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (8,'spark.executor.memory','执行器内存大小,默认值:3','Executor memory, default value: 3','spark-executor-memory','spark-executor-memory','Input',0,NULL,'3',0,NULL,0,1,1,0,'startup'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (9,'spark.executor.cores','执行器核心个数,默认值:1','Number of cores per executor, default value: 1','spark-executor-cores','spark-executor-cores','Input',0,NULL,'2',0,NULL,0,1,1,0,'startup'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (10,'spark.executor.instances','执行器个数,默认值:2','Number of executors, default value: 2','spark-executor-instances','spark-executor-instances','Input',0,NULL,'2',0,NULL,0,1,1,0,'startup'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (11,'wds.linkis.rm.yarnqueue','执行队列','Execution 
queue','wds-linkis-yarnqueue','wds-linkis-yarnqueue','Input',0,NULL,'dws',0,NULL,0,1,1,0,'startup'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (12,'resources',NULL,NULL,'资源信息','Resource information','Upload',0,'[]',NULL,0,NULL,0,1,1,0,'node'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (13,'category','请选择类型','Please choose the type','类型','Type','Select',1,'[\"node\"]','node',0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (14,'subject','请填写邮件标题','Please enter the email subject','邮件标题','Email Subject','Input',1,NULL,NULL,0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (15,'content','请选择或输入发送项','Please choose or enter the items to send','发送项','Intems to Send','MultiBinding',1,'[\"linkis.appconn.visualis.display\",\"linkis.appconn.visualis.dashboard\"]','[]',0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (16,'to','请填写收件人','Please enter 
recipients','收件人','To','Input',1,NULL,NULL,0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (17,'cc','请填写抄送人','Please enter carbon copy recipients','抄送','Cc','Input',0,NULL,NULL,0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (18,'bcc','请填写秘密发送人','Please enter blind carbon copy recipients','秘密抄送','Bcc','Input',0,NULL,NULL,0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (19,'itsm','请填写关联审批单','Please enter ITSM','关联审批单','ITSM','Input',1,NULL,NULL,0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (20,'msg.type','请正确填写消息类型','Please enter message type correctly','msg.type','msg.type','Disable',1,NULL,'SEND',0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (21,'msg.topic','消息主题,必须与eventreceiver完全一致','Message subject must be exactly the same as eventreceiver','msg.topic','msg.topic','Input',1,NULL,NULL,0,NULL,0,1,1,0,'runtime'); +insert into 
`dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (22,'msg.sender','请正确填写发送者','Please enter the sender correctly','msg.sender','msg.sender','Input',1,NULL,NULL,0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (23,'msg.name','消息名称,必须与eventreceiver完全一致','The message name must be exactly the same as the eventreceiver','msg.name','msg.name','Input',1,NULL,NULL,0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (24,'msg.body','请正确填写消息内容','Please enter the message content correctly','msg.body','msg.body','Text',0,NULL,NULL,0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (25,'msg.type','请正确填写消息类型','Please enter message type correctly','msg.type','msg.type','Disable',1,NULL,'RECEIVE',0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (26,'msg.receiver','请正确填写消息接收者','Please enter message recipients correctly','msg.receiver','msg.receiver','Input',1,NULL,NULL,0,NULL,0,1,1,0,'runtime'); +insert into 
`dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (27,'query.frequency','请填写查询频率,默认10次','Please enter query frequency, 10 times by default','query.frequency','query.frequency','Disable',0,NULL,'10',0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (28,'max.receive.hours','请填写等待时间,默认1小时','Please enter waiting time, 1 hour by default','max.receive.hours','max.receive.hours','Input',0,NULL,'12',0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (29,'msg.savekey','消息共享key值,默认msg.body','The ky of message content, msg.body by default','msg.savekey','msg.savekey','Input',0,NULL,'msg.body',0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (30,'only.receive.today',NULL,NULL,'only.receive.today','only.receive.today','Input',0,NULL,'true',0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (31,'source.type','请选择数据来源','Please choose the data 
source','source.type','source.type','Select',1,'[\"hivedb\",\"maskdb\"]',NULL,0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (32,'check.object','比如:db.tb{ds=${run_date}}','Please enter the name of data dependency','check.object','check.object','Input',1,NULL,NULL,0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (33,'max.check.hours',NULL,NULL,'max.check.hours','max.check.hours','Input',0,NULL,'1',0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (34,'job.desc','请正确填写多源配置','Please enter multi-source configuration correctly','job.desc','job.desc','Text',0,NULL,NULL,0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (35,'filter',NULL,NULL,'过滤条件','Filter','Input',0,NULL,NULL,0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (36,'executeUser','必填','Required','执行用户','Executor','Input',1,NULL,NULL,0,NULL,0,1,1,0,'runtime'); +insert into 
`dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (37,'bindViewKey','请选择上游节点','Please select upstream node','绑定上游节点','Bind front node','Binding',1,'[\"*\"]','empty',0,NULL,0,3,0,0,'node'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (39,'msg.topic','消息主题,必须与eventsender完全一致','Message subject must be exactly the same as eventsender','msg.topic','msg.topic','Input',1,NULL,NULL,0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (40,'msg.name','消息名称,必须与eventsender完全一致','The message name must be exactly the same as the eventsender ','msg.name','msg.name','Input',1,NULL,NULL,0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (41,'executeUser','请填写执行用户','Please enter execute user','执行用户','executeUser','Input',1,NULL,NULL,0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (42,'Filter','请填写过滤条件','Please enter filter','过滤条件','Filter','Input',1,NULL,NULL,0,NULL,0,1,1,0,'runtime'); +insert into 
`dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (43,'executeUser','请填写执行用户','Please enter execute user','执行用户','executeUser','Input',1,NULL,NULL,0,NULL,0,1,1,0,'runtime'); +insert into `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (44,'Filter','请填写过滤条件','Please enter filter','过滤条件','Filter','Input',0,NULL,NULL,0,NULL,0,1,1,0,'runtime'); +INSERT INTO `dss_workflow_node_ui`(`id`,`key`,`description`,`description_en`,`lable_name`,`lable_name_en`,`ui_type`,`required`,`value`,`default_value`,`is_hidden`,`condition`,`is_advanced`,`order`,`node_menu_type`,`is_base_info`,`position`) values (45,'ReuseEngine','请选择是否复用引擎','Please choose to reuse engin or not','是否复用引擎','reuse-engine-or-not','Select',1,'[\"true\",\"false\"]','true',0,NULL,0,1,1,0,'startup'); + + +DELETE FROM dss_workflow_node_ui_to_validate; +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (31,31); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (32,32); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (32,40); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (32,41); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (28,28); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (27,27); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (3,32); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (3,41); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (4,32); +insert into 
`dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (4,41); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (22,44); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (22,45); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (22,41); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (25,25); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (26,32); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (26,41); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (21,32); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (21,41); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (21,46); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (29,32); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (29,41); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (23,47); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (23,48); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (24,32); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (24,41); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (7,7); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (8,8); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (9,9); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (10,10); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (11,41); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (11,57); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (1,48); +insert into 
`dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (1,47); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (2,48); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (2,47); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (13,13); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (14,47); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (16,50); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (17,50); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (18,50); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (19,51); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (19,47); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (5,52); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (6,52); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (12,52); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (20,53); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (36,52); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (30,52); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (33,54); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (34,32); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (35,52); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (31,55); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (32,55); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (34,41); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (22,55); +insert into 
`dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (20,55); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (26,55); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (21,55); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (25,55); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (1,56); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (2,56); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (13,55); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (16,55); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (19,55); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (36,55); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (14,55); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (37,55); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (1,55); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (2,55); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (39,32); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (39,41); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (39,46); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (39,55); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (40,47); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (40,48); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (23,55); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (40,55); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (15,55); +insert into 
`dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (22,58); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (26,58); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (41,55); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (43,55); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (44,52); +insert into `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) values (42,52); +INSERT INTO `dss_workflow_node_ui_to_validate`(`ui_id`,`validate_id`) VALUES (45,59); + +DELETE FROM dss_workflow_node_ui_validate; +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('7','NumInterval','[1,15]','驱动器内存大小,默认值:2','Driver memory size, default value: 2','blur'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('8','NumInterval','[3,15]','执行器内存大小,默认值:3','Executor memory size, default value: 3','blur'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('9','NumInterval','[1,10]','执行器核心个数,默认值:1','Number of cores per executor, default value: 1','blur'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('10','NumInterval','[1,40]','执行器个数,默认值:2','Number of executors, default value: 2','blur'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('13','OFT','[\"node\"]','请选择类型','Please select type','change'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('25','OFT','[\"RECEIVE\"]','','Please select','change'); +insert into `dss_workflow_node_ui_validate` (`id`, 
`validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('27','NumInterval','[1,1000]','请填写查询频率,默认10次,范围:1-1000','Please fill in the query frequency, default: 10, range: 1 to 1000','blur'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('28','NumInterval','[1,1000]','请填写等待时间,默认1小时,范围:1-1000','Please enter the waiting time, 1 hour by default, range: 1 to 1000','blur'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('31','OFT','[\"hivedb\",\"maskdb\"]','','Please select a source type','blur'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('32','Regex','^[^\\u4e00-\\u9fa5]+$','此值不能输入中文','Chinese characters are not allowed','blur'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('40','Regex','^[a-zA-Z]([^.]*\\.[^.]*){1,}$','需要检查的数据源dbname.tablename{partition}','Data source to check: dbname.tablename{partition}','blur'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('41','Regex','^.{1,500}$','长度在1到500个字符','The length is between 1 and 500 characters','blur'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('44','Regex','^[a-zA-Z][a-zA-Z0-9_@-]*$','必须以字母开头,且只支持字母、数字、下划线、@、中横线','Must start with a letter; only alphanumeric characters, underscore(_), @ and hyphen(-) are allowed','blur'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) 
values('45','Regex','^[a-zA-Z0-9_-]([^@]*@[^@]*){2}[a-zA-Z\\d]$','此值格式错误,例如:ProjectName@WFName@jobName','Invalid format, example: ProjectName@WFName@jobName','blur'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('46','Regex','^[a-zA-Z0-9_-]([^_]*_[^_]*){2}[a-zA-Z\\d]$','此值格式错误,例如:bdp_tac_name','Invalid format, example: bdp_tac_name','blur'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('47','Regex','^.{1,128}$','长度在1到128个字符','The length is between 1 and 128 characters','blur'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('48','Regex','^[a-zA-Z][a-zA-Z0-9_-]*$','必须以字母开头,且只支持字母、数字、下划线、中横线!','Must start with a letter; only alphanumeric characters, underscore and hyphen are allowed!','blur'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('49','Regex','^[a-zA-Z0-9_\\u4e00-\\u9fa5]*$','只支持中文、字母、数字和下划线!','Only Chinese characters, alphanumeric characters and underscore are allowed!','blur'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('50','Regex','^[a-z][a-zA-Z0-9_.@;]*$','必须以字母开头,且只支持字母、数字、下划线、@、点','Must start with a letter; only letters, numbers, underscores, @ and dots are supported','blur'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('51','Regex','^[0-9_.]*$','只支持数字、下划线、点','Only numbers, underscores and dots are supported','blur'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('52','None',NULL,NULL,NULL,'blur'); +insert into `dss_workflow_node_ui_validate` 
(`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('53','OFT','[\"SEND\"]',NULL,NULL,'change'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('54','NumInterval','[1,1000]','请填写等待时间,默认1小时,范围:1-1000','Please fill in the waiting time, default 1 hour, range: 1-1000','blur'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('55','Required',NULL,'该值不能为空','The value cannot be empty','change'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('56','Function','validatorTitle','节点名不能和工作流名一样','The node name cannot be the same as the workflow name',NULL); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('57','Regex','^[a-zA-Z][a-zA-Z0-9_.-]*$','必须以字母开头,且只支持字母、数字、下划线、点、中横线!','Must start with a letter; only letters, numbers, underscores, dots and hyphens are supported!','blur'); +insert into `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('58','Regex','(.+)@(.+)@(.+)','此格式错误,例如:ProjectName@WFName@jobName','Invalid format, example: ProjectName@WFName@jobName','blur'); +INSERT INTO `dss_workflow_node_ui_validate` (`id`, `validate_type`, `validate_range`, `error_msg`, `error_msg_en`, `trigger`) values('59','OFT','["true","false"]','请填写是否复用引擎,false:不复用,true:复用','Please choose whether to reuse the engine, true: reuse, false: do not reuse','blur'); + +DELETE FROM dss_appconn; +INSERT INTO `dss_appconn` (`id`, `appconn_name`, `is_user_need_init`, `level`, `if_iframe`, `is_external`, `reference`, `class_name`, `appconn_class_path`, `resource`) VALUES (2, 'orchestrator-framework', 0, 1, NULL, 0, NULL, 
'com.webank.wedatasphere.dss.appconn.orchestrator.DefaultOrchestratorFrameworkAppConn', 'DSS_INSTALL_HOME_VAL/dss-appconns/orchestrator-framework/lib', ''); +INSERT INTO `dss_appconn` (`id`, `appconn_name`, `is_user_need_init`, `level`, `if_iframe`, `is_external`, `reference`, `class_name`, `appconn_class_path`, `resource`) VALUES (3, 'workflow', 0, 1, NULL, 0, NULL, 'com.webank.wedatasphere.dss.appconn.workflow.DefaultWorkflowAppConn', 'DSS_INSTALL_HOME_VAL/dss-appconns/workflow/lib', ''); +INSERT INTO `dss_appconn` (`id`, `appconn_name`, `is_user_need_init`, `level`, `if_iframe`, `is_external`, `reference`, `class_name`, `appconn_class_path`, `resource`) VALUES (5, 'eventchecker', 0, 1, NULL, 0, NULL, 'com.webank.wedatasphere.dss.appconn.eventchecker.EventCheckerAppConn', 'DSS_INSTALL_HOME_VAL/dss-appconns/eventchecker/lib', NULL); +INSERT INTO `dss_appconn` (`id`, `appconn_name`, `is_user_need_init`, `level`, `if_iframe`, `is_external`, `reference`, `class_name`, `appconn_class_path`, `resource`) VALUES (6, 'datachecker', 0, 1, NULL, 0, NULL, 'com.webank.wedatapshere.dss.appconn.datachecker.DataCheckerAppConn', 'DSS_INSTALL_HOME_VAL/dss-appconns/datachecker/lib', NULL); +insert into `dss_appconn` (`id`, `appconn_name`, `is_user_need_init`, `level`, `if_iframe`, `is_external`, `reference`, `class_name`, `appconn_class_path`, `resource`) values('7','sendemail','0','1',NULL,'0',NULL,'com.webank.wedatasphere.dss.appconn.sendemail.SendEmailAppConn','DSS_INSTALL_HOME_VAL/dss-appconns/sendemail/lib',NULL); + + +select @dss_appconn_orchestratorId:=id from `dss_appconn` where `appconn_name` = 'orchestrator-framework'; +select @dss_appconn_workflowId:=id from `dss_appconn` where `appconn_name` = 'workflow'; + +select @dss_appconn_eventcheckerId:=id from `dss_appconn` where `appconn_name` = 'eventchecker'; +select @dss_appconn_datacheckerId:=id from `dss_appconn` where `appconn_name` = 'datachecker'; + +DELETE FROM dss_appconn_instance; +INSERT INTO `dss_appconn_instance` 
(`appconn_id`, `label`, `url`, `enhance_json`, `homepage_url`, `redirect_url`) VALUES (@dss_appconn_orchestratorId, 'DEV', 'http://ORCHESTRATOR_IP:ORCHESTRATOR_PORT/#/workspaceHome?workspaceId=104', '', 'http://ORCHESTRATOR_IP:ORCHESTRATOR_PORT/#/workspaceHome?workspaceId=104', 'http://ORCHESTRATOR_IP:ORCHESTRATOR_PORT/#/workspaceHome?workspaceId=104'); +INSERT INTO `dss_appconn_instance` (`appconn_id`, `label`, `url`, `enhance_json`, `homepage_url`, `redirect_url`) VALUES (@dss_appconn_workflowId, 'DEV', 'http://WORKFLOW_IP:WORKFLOW_PORT/#/workspaceHome?workspaceId=104', '', 'http://WORKFLOW_IP:WORKFLOW_PORT/#/workspaceHome?workspaceId=104', 'http://WORKFLOW_IP:WORKFLOW_PORT/#/workspaceHome?workspaceId=104'); +INSERT INTO `dss_appconn_instance` (`appconn_id`, `label`, `url`, `enhance_json`, `homepage_url`, `redirect_url`) VALUES (@dss_appconn_eventcheckerId, 'DEV', 'eventchecker', '{"msg.eventchecker.jdo.option.name": "msg","msg.eventchecker.jdo.option.url": "EVENTCHECKER_JDBC_URL","msg.eventchecker.jdo.option.username": "EVENTCHECKER_JDBC_USERNAME","msg.eventchecker.jdo.option.password": "EVENTCHECKER_JDBC_PASSWORD"}', NULL, NULL); +INSERT INTO `dss_appconn_instance` (`appconn_id`, `label`, `url`, `enhance_json`, `homepage_url`, `redirect_url`) VALUES (@dss_appconn_datacheckerId, 'DEV', 'datachecker', '{"job.datachecker.jdo.option.name":"job","job.datachecker.jdo.option.url":"DATACHECKER_JOB_JDBC_URL","job.datachecker.jdo.option.username":"DATACHECKER_JOB_JDBC_USERNAME","job.datachecker.jdo.option.password":"DATACHECKER_JOB_JDBC_PASSWORD","bdp.datachecker.jdo.option.name":"bdp","bdp.datachecker.jdo.option.url":"DATACHECKER_BDP_JDBC_URL","bdp.datachecker.jdo.option.username":"DATACHECKER_BDP_JDBC_USERNAME","bdp.datachecker.jdo.option.password":"DATACHECKER_BDP_JDBC_PASSWORD","bdp.datachecker.jdo.option.login.type":"base64","bdp.mask.url":"http://BDP_MASK_IP:BDP_MASK_PORT/api/v1/mask-status?","bdp.mask.app.id":"wtss","bdp.mask.app.token":"20a0ccdfc0"}', NULL, 
NULL); +insert into `dss_appconn_instance` (`id`, `appconn_id`, `label`, `url`, `enhance_json`, `homepage_url`, `redirect_url`) values('12','7','DEV','sendemail','{"email.host":"EMAIL_HOST","email.port":"EMAIL_PORT","email.username":"EMAIL_USERNAME","email.password":"EMAIL_PASSWORD","email.protocol":"EMAIL_PROTOCOL"}',NULL,NULL); + +DELETE FROM dss_component_role; +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',5,'7','1',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',5,'1','1',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',5,'2','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',5,'3','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',5,'4','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',5,'5','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',5,'6','0',now(),'system'); + +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',7,'7','1',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',7,'1','1',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',7,'2','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, 
`updateby`) VALUES('-1',7,'3','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',7,'4','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',7,'5','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',7,'6','0',now(),'system'); + +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',10,'7','1',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',10,'1','1',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',10,'2','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',10,'3','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',10,'4','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',10,'5','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',10,'6','0',now(),'system'); + +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1','11','7','1',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1','11','1','1',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, 
`updateby`) VALUES('-1','11','2','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1','11','3','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1','11','4','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1','11','5','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1','11','6','0',now(),'system'); + +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',14,'7','1',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',14,'1','1',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',14,'2','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',14,'3','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',14,'4','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',14,'5','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',14,'6','0',now(),'system'); + +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',16,'7','1',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, 
`update_time`, `updateby`) VALUES('-1',16,'1','1',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',16,'2','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',16,'3','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',16,'4','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',16,'5','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',16,'6','0',now(),'system'); + +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',17,'7','1',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',17,'1','1',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',17,'2','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',17,'3','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',17,'4','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',17,'5','0',now(),'system'); +INSERT INTO `dss_component_role` (`workspace_id`, `component_id`, `role_id`, `priv`, `update_time`, `updateby`) VALUES('-1',17,'6','0',now(),'system'); diff --git a/db/qualitis.sql b/db/qualitis.sql deleted file mode 100644 index 
04fb15cfb..000000000 --- a/db/qualitis.sql +++ /dev/null @@ -1,3 +0,0 @@ -INSERT INTO `dss_application` (`id`, `name`, `url`, `is_user_need_init`, `level`, `user_init_url`, `exists_project_service`, `project_url`, `enhance_json`, `if_iframe`, `homepage_url`, `redirect_url`) VALUES (NULL, 'qualitis', 'http://QUALITIS_ADRESS_IP_2:QUALITIS_ADRESS_PORT', '0', '1', NULL, '1', 'http://QUALITIS_ADRESS_IP_2:QUALITIS_ADRESS_PORT/#/projects/list?id=${projectId}&flow=true', NULL, '1', 'http://QUALITIS_ADRESS_IP_2:QUALITIS_ADRESS_PORT/#/dashboard', 'http://QUALITIS_ADRESS_IP_2:QUALITIS_ADRESS_PORT/qualitis/api/v1/redirect'); -SELECT @qualitis_appid:=id from dss_application WHERE `name` = 'qualitis'; -INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.appjoint.qualitis', @qualitis_appid, NULL, '1', '0', '1', 'http://QUALITIS_ADRESS_IP_2:QUALITIS_ADRESS_PORT/#/addGroupTechniqueRule?tableType=1&id=${projectId}&ruleGroupId=${ruleGroupId}&nodeId=${nodeId}'); \ No newline at end of file diff --git a/db/visualis.sql b/db/visualis.sql deleted file mode 100644 index 71070c7bd..000000000 --- a/db/visualis.sql +++ /dev/null @@ -1,34 +0,0 @@ -INSERT INTO `dss_application` (`id`, `name`, `url`, `is_user_need_init`, `level`, `user_init_url`, `exists_project_service`, `project_url`, `enhance_json`, `if_iframe`, `homepage_url`, `redirect_url`) VALUES (NULL, 'visualis', null, '0', '1', NULL, '0', NULL, NULL, '1', NULL, NULL); -SELECT @visualis_appid:=id from dss_application WHERE `name` = 'visualis'; -INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.appjoint.visualis.display', @visualis_appid, '1', '1', '1', '1', NULL); -INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, 
`application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.appjoint.visualis.dashboard', @visualis_appid, '1', '1', '1', '1', NULL);UPDATE `dss_application` SET url = 'http://VISUALIS_NGINX_IP_2:VISUALIS_NGINX_PORT' WHERE `name` in('visualis'); -UPDATE `dss_application` SET url = 'http://VISUALIS_NGINX_IP_2:VISUALIS_NGINX_PORT' WHERE `name` in('visualis'); -UPDATE `dss_application` SET project_url = 'http://VISUALIS_NGINX_IP_2:VISUALIS_NGINX_PORT/dss/visualis/#/project/${projectId}',homepage_url = 'http://VISUALIS_NGINX_IP_2:VISUALIS_NGINX_PORT/dss/visualis/#/projects' WHERE `name` in('visualis'); -UPDATE `dss_workflow_node` SET jump_url = 'http://VISUALIS_NGINX_IP_2:VISUALIS_NGINX_PORT/dss/visualis/#/project/${projectId}/display/${nodeId}' where node_type = 'linkis.appjoint.visualis.display'; -UPDATE `dss_workflow_node` SET jump_url = 'http://VISUALIS_NGINX_IP_2:VISUALIS_NGINX_PORT/dss/visualis/#/project/${projectId}/portal/${nodeId}/portalName/${nodeName}' where node_type = 'linkis.appjoint.visualis.dashboard'; -INSERT INTO `linkis_application` (`id`, `name`, `chinese_name`, `description`) VALUES (NULL, 'visualis', NULL, NULL); -select @application_id:=id from `linkis_application` where `name` = 'visualis'; - - - - - - - - - - - - - - - - - - - - - - - - diff --git a/dss-appconn/appconns/dss-datachecker-appconn/pom.xml b/dss-appconn/appconns/dss-datachecker-appconn/pom.xml new file mode 100644 index 000000000..c970ff2b4 --- /dev/null +++ b/dss-appconn/appconns/dss-datachecker-appconn/pom.xml @@ -0,0 +1,158 @@ + + + + + + dss + com.webank.wedatasphere.dss + 1.0.0 + ../../../pom.xml + + 4.0.0 + + dss-datachecker-appconn + + + + + + com.webank.wedatasphere.dss + dss-appconn-core + ${dss.version} + + + linkis-common + com.webank.wedatasphere.linkis + + + json4s-jackson_2.11 + org.json4s + + + + + + + com.webank.wedatasphere.dss + dss-development-process-standard + ${dss.version} + + + 
+ com.webank.wedatasphere.dss + dss-development-process-standard-execution + ${dss.version} + + + + log4j + log4j + 1.2.17 + + + + com.squareup.okhttp3 + okhttp + 4.2.2 + + + + org.apache.commons + commons-lang3 + 3.4 + + + + com.alibaba + druid + 1.0.28 + + + com.webank.wedatasphere.dss + dss-common + ${dss.version} + provided + + + + + + + + org.apache.maven.plugins + maven-deploy-plugin + + + + net.alchim31.maven + scala-maven-plugin + + + org.apache.maven.plugins + maven-jar-plugin + + + org.apache.maven.plugins + maven-assembly-plugin + 2.3 + false + + + make-assembly + package + + single + + + + src/main/assembly/distribution.xml + + + + + + false + out + false + false + + src/main/assembly/distribution.xml + + + + + + + src/main/java + + **/*.xml + + + + src/main/resources + + **/*.properties + **/application.yml + **/bootstrap.yml + **/log4j2.xml + + + + + \ No newline at end of file diff --git a/dss-appconn/appconns/dss-datachecker-appconn/src/main/assembly/distribution.xml b/dss-appconn/appconns/dss-datachecker-appconn/src/main/assembly/distribution.xml new file mode 100644 index 000000000..3863a4afe --- /dev/null +++ b/dss-appconn/appconns/dss-datachecker-appconn/src/main/assembly/distribution.xml @@ -0,0 +1,66 @@ + + + + dss-datachecker-appconn + + dir + + true + datachecker + + + + + + lib + true + true + false + true + true + + + + + + ${basedir}/src/main/resources + + appconn.properties + + 0777 + / + unix + + + + ${basedir}/src/main/resources + + log4j.properties + log4j2.xml + + 0777 + conf + unix + + + + + + diff --git a/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/DataChecker.java b/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/DataChecker.java new file mode 100644 index 000000000..dfd2ee63d --- /dev/null +++ 
b/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/DataChecker.java @@ -0,0 +1,93 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatapshere.dss.appconn.datachecker; + + +import com.webank.wedatapshere.dss.appconn.datachecker.connector.DataCheckerDao; +import com.webank.wedatasphere.dss.standard.app.development.listener.common.RefExecutionAction; +import com.webank.wedatasphere.dss.standard.app.development.listener.common.RefExecutionState; +import org.apache.log4j.Logger; + +import java.util.Properties; + +public class DataChecker { + public final static String SOURCE_TYPE = "source.type"; + public final static String DATA_OBJECT = "check.object"; + public final static String WAIT_TIME = "max.check.hours"; + public final static String QUERY_FREQUENCY = "query.frequency"; + public final static String TIME_SCAPE = "time.scape"; + public final static String MASK_URL = "bdp.mask.url"; + public final static String MASK_APP_ID = "bdp.mask.app.id"; + public final static String MASK_APP_TOKEN = "bdp.mask.app.token"; + + private Properties p; + private static final Logger logger = Logger.getRootLogger(); + DataCheckerDao wbDao = DataCheckerDao.getInstance(); + DataCheckerExecutionAction dataCheckerAction = null; + public long maxWaitTime; + public int queryFrequency; + + public DataChecker(Properties p, DataCheckerExecutionAction action) { + 
this.p = p; + dataCheckerAction = action; + maxWaitTime = Long.valueOf(p.getProperty(DataChecker.WAIT_TIME, "1")) * 3600 * 1000; + // For testing, a shorter timeout can be used instead: +// maxWaitTime = Long.valueOf(p.getProperty(DataChecker.WAIT_TIME, "1")) * 120 * 1000; + queryFrequency = Integer.valueOf(p.getProperty(DataChecker.QUERY_FREQUENCY, "30000")); + + } + + public void run() { + dataCheckerAction.setState(RefExecutionState.Running); + try { + if (p == null) { + throw new RuntimeException("Properties is null. Can't continue"); + } + if (!p.containsKey(SOURCE_TYPE)) { + logger.info("Property " + SOURCE_TYPE + " is not set!"); + } + if (!p.containsKey(DATA_OBJECT)) { + logger.info("Property " + DATA_OBJECT + " is not set!"); + } + beginCheck(dataCheckerAction); + } catch (Exception ex) { + dataCheckerAction.setState(RefExecutionState.Failed); + throw new RuntimeException("get DataChecker result failed", ex); + } + + } + + public void beginCheck(RefExecutionAction action) { + boolean success = false; + try { + success = wbDao.validateTableStatusFunction(p, logger, action); + } catch (Exception ex) { + dataCheckerAction.setState(RefExecutionState.Failed); + logger.error("data check failed", ex); + throw new RuntimeException("get DataChecker result failed", ex); + } + if (success) { + dataCheckerAction.setState(RefExecutionState.Success); + } else { + dataCheckerAction.setState(RefExecutionState.Running); + } + } + + public void cancel() { + } + +} \ No newline at end of file diff --git a/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/DataCheckerAppConn.java b/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/DataCheckerAppConn.java new file mode 100644 index 000000000..61d48e28c --- /dev/null +++ b/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/DataCheckerAppConn.java @@ -0,0 +1,37 @@ +/* + * Copyright 2019 WeBank + * 
Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatapshere.dss.appconn.datachecker; + +import com.webank.wedatapshere.dss.appconn.datachecker.standard.DataCheckerDevelopmentStandard; +import com.webank.wedatasphere.dss.appconn.core.ext.OnlyDevelopmentAppConn; +import com.webank.wedatasphere.dss.appconn.core.impl.AbstractAppConn; +import com.webank.wedatasphere.dss.standard.app.development.standard.DevelopmentIntegrationStandard; + +public class DataCheckerAppConn extends AbstractAppConn implements OnlyDevelopmentAppConn { + + private DataCheckerDevelopmentStandard standard; + + @Override + protected void initialize() { + standard = new DataCheckerDevelopmentStandard(); + } + + @Override + public DevelopmentIntegrationStandard getOrCreateDevelopmentStandard() { + return standard; + } +} diff --git a/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/DataCheckerCompletedExecutionResponseRef.java b/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/DataCheckerCompletedExecutionResponseRef.java new file mode 100644 index 000000000..88c6bc619 --- /dev/null +++ b/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/DataCheckerCompletedExecutionResponseRef.java @@ -0,0 +1,50 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + 
* you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+package com.webank.wedatapshere.dss.appconn.datachecker;
+
+import com.webank.wedatasphere.dss.standard.app.development.listener.common.CompletedExecutionResponseRef;
+
+import java.util.Map;
+
+public class DataCheckerCompletedExecutionResponseRef extends CompletedExecutionResponseRef {
+
+    private Exception exception;
+
+    public void setException(Exception exception) {
+        this.exception = exception;
+    }
+
+    public DataCheckerCompletedExecutionResponseRef(int status) {
+        super(status);
+    }
+
+    public DataCheckerCompletedExecutionResponseRef(String responseBody, int status) {
+        super(responseBody, status);
+    }
+
+    public void setStatus(int status) {
+        this.status = status;
+    }
+
+    @Override
+    public Map<String, Object> toMap() {
+        return null;
+    }
+
+} diff --git a/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/common/MaskCheckNotExistException.java b/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/common/MaskCheckNotExistException.java new file mode 100644 index 000000000..5f1443674 --- /dev/null +++ b/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/common/MaskCheckNotExistException.java @@ -0,0 +1,24 @@ +/*
+ * Copyright 2019 WeBank
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatapshere.dss.appconn.datachecker.common; + +public class MaskCheckNotExistException extends Exception { + + public MaskCheckNotExistException(final String message) { + super(message); + } +} diff --git a/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/connector/DataCheckerDao.java b/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/connector/DataCheckerDao.java new file mode 100644 index 000000000..6ff9387d7 --- /dev/null +++ b/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/connector/DataCheckerDao.java @@ -0,0 +1,367 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
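DataChecker (above) turns two job properties into its polling budget: `max.check.hours` (converted from hours to milliseconds, default 1) and `query.frequency` (default 30000). A standalone sketch of that conversion — the `CheckBudget` class name is illustrative, not a DSS type:

```java
import java.util.Properties;

// Illustrative sketch of DataChecker's property parsing:
// max.check.hours (hours -> ms) and query.frequency, with the same defaults.
public class CheckBudget {
    public final long maxWaitMs;
    public final int queryFrequency;

    public CheckBudget(Properties p) {
        // mirrors: Long.valueOf(p.getProperty(WAIT_TIME, "1")) * 3600 * 1000
        this.maxWaitMs = Long.parseLong(p.getProperty("max.check.hours", "1")) * 3600 * 1000;
        this.queryFrequency = Integer.parseInt(p.getProperty("query.frequency", "30000"));
    }
}
```

With `max.check.hours=2` this yields a 7,200,000 ms budget; the checker re-queries until the data object appears or the budget is exhausted.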
+ * + */ + +package com.webank.wedatapshere.dss.appconn.datachecker.connector; + +import com.alibaba.druid.pool.DruidDataSource; + +import com.webank.wedatapshere.dss.appconn.datachecker.DataChecker; +import com.webank.wedatapshere.dss.appconn.datachecker.common.MaskCheckNotExistException; +import com.webank.wedatapshere.dss.appconn.datachecker.utils.HttpUtils; +import com.webank.wedatasphere.dss.standard.app.development.listener.common.RefExecutionAction; +import okhttp3.FormBody; +import okhttp3.RequestBody; +import okhttp3.Response; +import okhttp3.ResponseBody; +import org.apache.commons.lang.StringUtils; +import org.apache.log4j.Logger; + +import javax.sql.DataSource; +import java.io.IOException; +import java.sql.Connection; +import java.sql.PreparedStatement; +import java.sql.ResultSet; +import java.sql.SQLException; +import java.util.HashMap; +import java.util.List; +import java.util.Map; +import java.util.Properties; +import java.util.function.Predicate; +import java.util.function.Supplier; +import java.util.regex.Matcher; +import java.util.regex.Pattern; +import java.util.stream.Collectors; + +public class DataCheckerDao { + + private static final String SQL_SOURCE_TYPE_JOB_TABLE = + "SELECT * FROM DBS d JOIN TBLS t ON t.DB_ID = d.DB_ID WHERE d.NAME=? AND t.TBL_NAME=?"; + + private static final String SQL_SOURCE_TYPE_JOB_PARTITION = + "SELECT * FROM DBS d JOIN TBLS t ON t.DB_ID = d.DB_ID JOIN PARTITIONS p ON p.TBL_ID = t.TBL_ID WHERE d.NAME=? AND t.TBL_NAME=? AND p.PART_NAME=?"; + + private static final String SQL_SOURCE_TYPE_BDP = + "SELECT * FROM desktop_bdapimport WHERE bdap_db_name = ? AND bdap_table_name = ? AND target_partition_name = ? AND status = '1';"; + + private static final String SQL_SOURCE_TYPE_BDP_WITH_TIME_CONDITION = + "SELECT * FROM desktop_bdapimport WHERE bdap_db_name = ? AND bdap_table_name = ? AND target_partition_name = ? " + + "AND (UNIX_TIMESTAMP() - UNIX_TIMESTAMP(STR_TO_DATE(modify_time, '%Y-%m-%d %H:%i:%s'))) <= ? 
AND status = '1';";
+
+    // volatile is required for the double-checked locking used below
+    private static volatile DataSource jobDS;
+    private static volatile DataSource bdpDS;
+    private static volatile DataCheckerDao instance;
+
+    public static DataCheckerDao getInstance() {
+        if (instance == null) {
+            synchronized (DataCheckerDao.class) {
+                if (instance == null) {
+                    instance = new DataCheckerDao();
+                }
+            }
+        }
+        return instance;
+    }
+
+    public boolean validateTableStatusFunction(Properties props, Logger log, RefExecutionAction action) {
+        if (jobDS == null) {
+            jobDS = DataDruidFactory.getJobInstance(props, log);
+            if (jobDS == null) {
+                log.error("Error getting Druid DataSource instance");
+                return false;
+            }
+        }
+        if (bdpDS == null) {
+            bdpDS = DataDruidFactory.getBDPInstance(props, log);
+            if (bdpDS == null) {
+                log.error("Error getting Druid DataSource instance");
+                return false;
+            }
+        }
+        removeBlankSpace(props);
+        log.info("=============================Data Check Start==========================================");
+        String dataCheckerInfo = props.getProperty(DataChecker.DATA_OBJECT);
+        if (null != action.getExecutionRequestRefContext()) {
+            action.getExecutionRequestRefContext().appendLog("Database table partition info : " + dataCheckerInfo);
+        }
+        log.info("(DataChecker info) database table partition info : " + dataCheckerInfo);
+        long waitTime = Long.valueOf(props.getProperty(DataChecker.WAIT_TIME, "1")) * 3600 * 1000;
+        int queryFrequency = Integer.valueOf(props.getProperty(DataChecker.QUERY_FREQUENCY, "30000"));
+        log.info("(DataChecker info) wait time : " + waitTime);
+        log.info("(DataChecker info) query frequency : " + queryFrequency);
+        List<Map<String, String>> dataObjectList = extractProperties(props);
+        try (Connection jobConn = jobDS.getConnection();
+             Connection bdpConn = bdpDS.getConnection()) {
+            boolean flag = dataObjectList
+                    .stream()
+                    .allMatch(proObjectMap -> getDataCheckResult(proObjectMap, jobConn, bdpConn, props, log));
+            if (flag) {
+                log.info("=============================Data Check End==========================================");
+                return true;
+            }
+        } catch (SQLException e) {
+            throw new RuntimeException("get DataChecker result failed", e);
+        }
+        log.info("=============================Data Check End==========================================");
+        return false;
+    }
+
+    private boolean getDataCheckResult(Map<String, String> proObjectMap, Connection jobConn, Connection bdpConn, Properties props, Logger log) {
+        Predicate<Map<String, String>> hasDataSource = p -> StringUtils.isNotEmpty(proObjectMap.get(DataChecker.SOURCE_TYPE));
+        Predicate<Map<String, String>> hasNotDataSource = hasDataSource.negate();
+        Supplier<String> sourceType = () -> proObjectMap.get(DataChecker.SOURCE_TYPE).toLowerCase();
+        Predicate<Map<String, String>> isJobDataSource = p -> sourceType.get().equals("hivedb") || sourceType.get().equals("job");
+        Predicate<Map<String, String>> isBdpDataSource = p -> sourceType.get().equals("maskdb") || sourceType.get().equals("bdp");
+        Predicate<Map<String, String>> isOdsDB = p -> {
+            String dataObject = proObjectMap.get(DataChecker.DATA_OBJECT)
+                    .replace(" ", "").trim();
+            String dbName = dataObject.split("\\.")[0];
+            return dbName.contains("_ods");
+        };
+        Predicate<Map<String, String>> isNotOdsDB = isOdsDB.negate();
+        Predicate<Map<String, String>> isCheckMetadata = (hasDataSource.and(isJobDataSource)).or(hasNotDataSource.and(isNotOdsDB));
+        Predicate<Map<String, String>> isCheckMask = (hasDataSource.and(isBdpDataSource)).or(hasNotDataSource.and(isOdsDB));
+        return isCheckMetadata.test(proObjectMap)
+                ? getJobTotalCount(proObjectMap, jobConn, log) > 0
+                : isCheckMask.test(proObjectMap) &&
+                        (getBdpTotalCount(proObjectMap, bdpConn, log, props) > 0
+                                || "success".equals(fetchMaskCode(proObjectMap, log, props).get("maskStatus")));
+    }
+
+    private void sleep(long sleepTime) {
+        try {
+            Thread.sleep(sleepTime);
+        } catch (InterruptedException e) {
+            Thread.currentThread().interrupt();
+        }
+    }
+
+    private void removeBlankSpace(Properties props) {
+        try {
+            props.entrySet().forEach(entry -> {
+                String value = entry.getValue().toString().replaceAll(" ", "").trim();
+                entry.setValue(value);
+            });
+        } catch (Exception e) {
+            throw new RuntimeException("remove job space char failed", e);
+        }
+    }
+
+    private List<Map<String, String>> extractProperties(Properties p) {
+        return p.keySet().stream()
+                .map(key -> key2Map(key, p)).filter(x -> x.size() > 0)
+                .collect(Collectors.toList());
+    }
+
+    private Map<String, String> key2Map(Object key, Properties p) {
+        Map<String, String> proMap = new HashMap<>();
+        String skey = String.valueOf(key);
+        if (skey.contains(DataChecker.DATA_OBJECT)) {
+            String[] keyArr = skey.split("\\.");
+            if (keyArr.length == 3) {
+                String keyNum = keyArr[2];
+                String stKey = DataChecker.SOURCE_TYPE + "." + keyNum;
+                String doKey = DataChecker.DATA_OBJECT + "." + keyNum;
+                if (null != p.get(stKey)) {
+                    proMap.put(DataChecker.SOURCE_TYPE, String.valueOf(p.get(stKey)));
+                }
+                proMap.put(DataChecker.DATA_OBJECT, String.valueOf(p.get(doKey)));
+            } else {
+                String stKey = DataChecker.SOURCE_TYPE;
+                String doKey = DataChecker.DATA_OBJECT;
+                if (null != p.get(stKey)) {
+                    proMap.put(DataChecker.SOURCE_TYPE, String.valueOf(p.get(stKey)));
+                }
+                proMap.put(DataChecker.DATA_OBJECT, String.valueOf(p.get(doKey)));
+            }
+        }
+        return proMap;
+    }
+
+    private PreparedStatement getJobStatement(Connection conn, String dataObject) throws SQLException {
+        String dataScape = dataObject.contains("{") ?
"Partition" : "Table";
+        String[] dataObjectArray = dataObject.split("\\.");
+        String dbName = dataObject.split("\\.")[0];
+        String tableName = dataObject.split("\\.")[1];
+        if (dataScape.equals("Partition")) {
+            Pattern pattern = Pattern.compile("\\{([^\\}]+)\\}");
+            Matcher matcher = pattern.matcher(dataObject);
+            String partitionName = "";
+            if (matcher.find()) {
+                partitionName = matcher.group(1);
+            }
+            partitionName = partitionName.replace("\'", "").replace("\"", "");
+            tableName = tableName.split("\\{")[0];
+            PreparedStatement pstmt = conn.prepareCall(SQL_SOURCE_TYPE_JOB_PARTITION);
+            pstmt.setString(1, dbName);
+            pstmt.setString(2, tableName);
+            pstmt.setString(3, partitionName);
+            return pstmt;
+        } else if (dataObjectArray.length == 2) {
+            PreparedStatement pstmt = conn.prepareCall(SQL_SOURCE_TYPE_JOB_TABLE);
+            pstmt.setString(1, dbName);
+            pstmt.setString(2, tableName);
+            return pstmt;
+        } else {
+            throw new SQLException("Error for DataObject format!");
+        }
+    }
+
+    private PreparedStatement getBdpStatement(Connection conn, String dataObject, String timeScape) throws SQLException {
+        String dataScape = dataObject.contains("{") ?
"Partition" : "Table";
+        String dbName = dataObject.split("\\.")[0];
+        String tableName = dataObject.split("\\.")[1];
+        String partitionName = "";
+        Pattern pattern = Pattern.compile("\\{([^\\}]+)\\}");
+        if (dataScape.equals("Partition")) {
+            Matcher matcher = pattern.matcher(dataObject);
+            if (matcher.find()) {
+                partitionName = matcher.group(1);
+            }
+            partitionName = partitionName.replace("\'", "").replace("\"", "");
+            tableName = tableName.split("\\{")[0];
+        }
+        PreparedStatement pstmt = null;
+        if (timeScape.equals("NULL")) {
+            pstmt = conn.prepareCall(SQL_SOURCE_TYPE_BDP);
+        } else {
+            pstmt = conn.prepareCall(SQL_SOURCE_TYPE_BDP_WITH_TIME_CONDITION);
+            pstmt.setInt(4, Integer.valueOf(timeScape) * 3600);
+        }
+        pstmt.setString(1, dbName);
+        pstmt.setString(2, tableName);
+        pstmt.setString(3, partitionName);
+        return pstmt;
+    }
+
+    private long getJobTotalCount(Map<String, String> proObjectMap, Connection conn, Logger log) {
+        String dataObject = proObjectMap.get(DataChecker.DATA_OBJECT);
+        if (dataObject != null) {
+            dataObject = dataObject.replace(" ", "").trim();
+        }
+        log.info("-------------------------------------- search hive/spark/mr data ");
+        log.info("-------------------------------------- : " + dataObject);
+        try (PreparedStatement pstmt = getJobStatement(conn, dataObject)) {
+            ResultSet rs = pstmt.executeQuery();
+            return rs.last() ? rs.getRow() : 0;
+        } catch (SQLException e) {
+            log.error("fetch data from Hive MetaStore error", e);
+            return 0;
+        }
+    }
+
+    private long getBdpTotalCount(Map<String, String> proObjectMap, Connection conn, Logger log, Properties props) {
+        String dataObject = proObjectMap.get(DataChecker.DATA_OBJECT);
+        if (dataObject != null) {
+            dataObject = dataObject.replace(" ", "").trim();
+        }
+        String timeScape = props.getOrDefault(DataChecker.TIME_SCAPE, "NULL").toString();
+        log.info("-------------------------------------- search bdp data ");
+        log.info("-------------------------------------- : " + dataObject);
+        try (PreparedStatement pstmt = getBdpStatement(conn, dataObject, timeScape)) {
+            ResultSet rs = pstmt.executeQuery();
+            return rs.last() ? rs.getRow() : 0;
+        } catch (SQLException e) {
+            log.error("fetch data from the BDP database error", e);
+            return 0;
+        }
+    }
+
+    private Map<String, String> fetchMaskCode(Map<String, String> proObjectMap, Logger log, Properties props) {
+        log.info("=============================Querying data status via the BDP MASK interface==========================================");
+        Map<String, String> resultMap = new HashMap<>();
+        String maskUrl = props.getProperty(DataChecker.MASK_URL);
+        String dataObject = proObjectMap.get(DataChecker.DATA_OBJECT);
+        if (dataObject != null) {
+            dataObject = dataObject.replace(" ", "").trim();
+        }
+        String dataScape = dataObject.contains("{") ? "Partition" : "Table";
+        String dbName = dataObject.split("\\.")[0];
+        String tableName = dataObject.split("\\.")[1];
+        String partitionName = "";
+        Pattern pattern = Pattern.compile("\\{([^\\}]+)\\}");
+        if (dataScape.equals("Partition")) {
+            Matcher matcher = pattern.matcher(dataObject);
+            if (matcher.find()) {
+                partitionName = matcher.group(1);
+            }
+            partitionName = partitionName.replace("\'", "").replace("\"", "");
+            tableName = tableName.split("\\{")[0];
+        }
+        try {
+            RequestBody requestBody = new FormBody.Builder()
+                    .add("targetDb", dbName)
+                    .add("targetTable", tableName)
+                    .add("partition", partitionName)
+                    .build();
+            Map<String, String> dataMap = HttpUtils.initSelectParams(props);
+            log.info("request body: dbName--" + dbName + " tableName--" + tableName + " partitionName--" + partitionName);
+            Response response = HttpUtils.httpClientHandleBase(maskUrl, requestBody, dataMap);
+            handleResponse(response, resultMap, log);
+        } catch (IOException e) {
+            log.error("fetch data from BDP MASK failed ");
+            resultMap.put("maskStatus", "noPrepare");
+        } catch (MaskCheckNotExistException e) {
+            String errorMessage = "fetch data from BDP MASK failed, " +
+                    "please check whether database: " + dbName + ", table: " + tableName + " exists";
+            log.error(errorMessage);
+            throw new RuntimeException(errorMessage, e);
+        }
+        return resultMap;
+    }
+
+    private void handleResponse(Response response, Map<String, String> proObjectMap, Logger log)
+            throws IOException, MaskCheckNotExistException {
+        int responseCode = response.code();
+        ResponseBody body = response.body();
+        if (responseCode == 200) {
+            handleResponseBody(body, proObjectMap, log);
+        } else {
+            proObjectMap.put("maskStatus", "noPrepare");
+        }
+    }
+
+    private void handleResponseBody(ResponseBody body, Map<String, String> proObjectMap, Logger log)
+            throws IOException, MaskCheckNotExistException {
+        String bodyStr = body.string();
+        log.info("mask interface response body:" + bodyStr);
+        Map<String, Object> entityMap = HttpUtils.getReturnMap(bodyStr);
+        String codeValue = (String)
entityMap.get("code");
+        if ("200".equals(codeValue)) {
+            proObjectMap.put("maskStatus", "success");
+        } else if ("1011".equals(codeValue)) {
+            throw new MaskCheckNotExistException("Mask check failed");
+        } else {
+            proObjectMap.put("maskStatus", "noPrepare");
+        }
+    }
+
+    public static void closeDruidDataSource() {
+        // close both pools, not only the job one
+        DruidDataSource jobDSObject = (DruidDataSource) jobDS;
+        if (jobDSObject != null) {
+            jobDSObject.close();
+        }
+        DruidDataSource bdpDSObject = (DruidDataSource) bdpDS;
+        if (bdpDSObject != null) {
+            bdpDSObject.close();
+        }
+    }
+
+} diff --git a/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/connector/DataDruidFactory.java b/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/connector/DataDruidFactory.java new file mode 100644 index 000000000..ba7e9adba --- /dev/null +++ b/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/connector/DataDruidFactory.java @@ -0,0 +1,157 @@ +/*
+ * Copyright 2019 WeBank
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
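DataCheckerDao (above) repeatedly splits a `check.object` value such as `db_name.table{ds='2021-06-01'}` into database, table, and optional partition using the regex `\{([^\}]+)\}`. A self-contained sketch of that parsing step — the `DataObjectParser` class is illustrative, not part of DSS:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustrative re-implementation of the db.table{partition} parsing used by DataCheckerDao.
public class DataObjectParser {
    private static final Pattern PARTITION = Pattern.compile("\\{([^\\}]+)\\}");

    /** Returns {dbName, tableName, partitionName}; partitionName is "" when absent. */
    public static String[] parse(String dataObject) {
        String cleaned = dataObject.replace(" ", "").trim();
        String dbName = cleaned.split("\\.")[0];
        String tableName = cleaned.split("\\.")[1];
        String partitionName = "";
        Matcher m = PARTITION.matcher(cleaned);
        if (m.find()) {
            // strip quotes, as the DAO does before binding PART_NAME
            partitionName = m.group(1).replace("'", "").replace("\"", "");
            tableName = tableName.split("\\{")[0];
        }
        return new String[]{dbName, tableName, partitionName};
    }
}
```

As in the original, a dot inside the partition value would confuse the `split("\\.")` step — a caveat this sketch inherits rather than fixes.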
+ * + */ + +package com.webank.wedatapshere.dss.appconn.datachecker.connector; + +import com.alibaba.druid.pool.DruidDataSource; +import org.apache.commons.lang3.StringUtils; +import org.apache.log4j.Logger; + +import java.util.Base64; +import java.util.Properties; + +public class DataDruidFactory { + private static DruidDataSource jobInstance; + private static DruidDataSource bdpInstance; + private static DruidDataSource msgInstance; + + public static DruidDataSource getJobInstance(Properties props, Logger log) { + if (jobInstance == null ) { + synchronized (DataDruidFactory.class) { + if(jobInstance == null) { + try { + jobInstance = createDataSource(props, log, "Job"); + } catch (Exception e) { + throw new RuntimeException("Error creating Druid DataSource", e); + } + } + } + } + return jobInstance; + } + public static DruidDataSource getBDPInstance(Properties props, Logger log) { + if (bdpInstance == null ) { + synchronized (DataDruidFactory.class) { + if(bdpInstance == null) { + try { + bdpInstance = createDataSource(props, log, "BDP"); + } catch (Exception e) { + throw new RuntimeException("Error creating Druid DataSource", e); + } + } + } + } + return bdpInstance; + } + + public static DruidDataSource getMsgInstance(Properties props, Logger log) { + if (msgInstance == null ) { + synchronized (DataDruidFactory.class) { + if(msgInstance == null) { + try { + msgInstance = createDataSource(props, log, "Msg"); + } catch (Exception e) { + throw new RuntimeException("Error creating Druid DataSource", e); + } + } + } + } + return msgInstance; + } + + private static DruidDataSource createDataSource(Properties props, Logger log, String type) { + String name = null; + String url = null; + String username = null; + String password = null; + String loginType = null; + if (type.equals("Job")) { + name = props.getProperty("job.datachecker.jdo.option.name"); + url = props.getProperty("job.datachecker.jdo.option.url"); + username = 
props.getProperty("job.datachecker.jdo.option.username");
+            password = props.getProperty("job.datachecker.jdo.option.password");
+            loginType = props.getProperty("job.datachecker.jdo.option.login.type");
+            log.info("job url is: " + url + " and name is: " + username);
+            try {
+                if ("base64".equals(loginType)) {
+                    password = new String(Base64.getDecoder().decode(props.getProperty("job.datachecker.jdo.option.password").getBytes()), "UTF-8");
+                } else {
+                    password = props.getProperty("job.datachecker.jdo.option.password");
+                }
+            } catch (Exception e) {
+                log.error("password decode failed", e);
+            }
+        } else if (type.equals("BDP")) {
+            name = props.getProperty("bdp.datachecker.jdo.option.name");
+            url = props.getProperty("bdp.datachecker.jdo.option.url");
+            username = props.getProperty("bdp.datachecker.jdo.option.username");
+            password = props.getProperty("bdp.datachecker.jdo.option.password");
+            loginType = props.getProperty("bdp.datachecker.jdo.option.login.type");
+            log.info("bdp url is: " + url + " and name is: " + username);
+            try {
+                if ("base64".equals(loginType)) {
+                    password = new String(Base64.getDecoder().decode(props.getProperty("bdp.datachecker.jdo.option.password").getBytes()), "UTF-8");
+                } else {
+                    password = props.getProperty("bdp.datachecker.jdo.option.password");
+                }
+            } catch (Exception e) {
+                log.error("password decode failed", e);
+            }
+        }
+        int initialSize = Integer.valueOf(props.getProperty("datachecker.jdo.option.initial.size", "1"));
+        int maxActive = Integer.valueOf(props.getProperty("datachecker.jdo.option.max.active", "100"));
+        int minIdle = Integer.valueOf(props.getProperty("datachecker.jdo.option.min.idle", "1"));
+        long maxWait = Long.valueOf(props.getProperty("datachecker.jdo.option.max.wait", "60000"));
+        String validationQuery = props.getProperty("datachecker.jdo.option.validation.query", "SELECT 'x'");
+        long timeBetweenEvictionRunsMillis = Long.valueOf(props.getProperty("datachecker.jdo.option.time.between.eviction.runs.millis", "6000"));
+        long minEvictableIdleTimeMillis = Long.valueOf(props.getProperty("datachecker.jdo.option.evictable.idle.time.millis", "300000"));
+        boolean testOnBorrow = Boolean.valueOf(props.getProperty("datachecker.jdo.option.test.on.borrow", "true"));
+        int maxOpenPreparedStatements = Integer.valueOf(props.getProperty("datachecker.jdo.option.max.open.prepared.statements", "-1"));
+
+        if (timeBetweenEvictionRunsMillis > minEvictableIdleTimeMillis) {
+            timeBetweenEvictionRunsMillis = minEvictableIdleTimeMillis;
+        }
+
+        DruidDataSource ds = new DruidDataSource();
+
+        if (StringUtils.isNotBlank(name)) {
+            ds.setName(name);
+        }
+
+        ds.setUrl(url);
+        ds.setDriverClassName("com.mysql.jdbc.Driver");
+        ds.setUsername(username);
+        ds.setPassword(password);
+        ds.setInitialSize(initialSize);
+        ds.setMinIdle(minIdle);
+        ds.setMaxActive(maxActive);
+        ds.setMaxWait(maxWait);
+        ds.setTestOnBorrow(testOnBorrow);
+        ds.setValidationQuery(validationQuery);
+        ds.setTimeBetweenEvictionRunsMillis(timeBetweenEvictionRunsMillis);
+        ds.setMinEvictableIdleTimeMillis(minEvictableIdleTimeMillis);
+        if (maxOpenPreparedStatements > 0) {
+            ds.setPoolPreparedStatements(true);
+            ds.setMaxPoolPreparedStatementPerConnectionSize(
+                    maxOpenPreparedStatements);
+        } else {
+            ds.setPoolPreparedStatements(false);
+        }
+        log.info("Druid data source initialized!");
+        return ds;
+    }
+} diff --git a/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/service/DataCheckerExecuteService.java b/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/service/DataCheckerExecuteService.java new file mode 100644 index 000000000..3ab3e5674 --- /dev/null +++ b/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/service/DataCheckerExecuteService.java @@ -0,0 +1,33 @@ +/*
+ * Copyright 2019 WeBank
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not
use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatapshere.dss.appconn.datachecker.service; + +import com.webank.wedatapshere.dss.appconn.datachecker.DataCheckerRefExecutionOperation; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefExecutionOperation; +import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefExecutionService; + +public class DataCheckerExecuteService extends AbstractRefExecutionService { + + + @Override + public RefExecutionOperation createRefExecutionOperation() { + DataCheckerRefExecutionOperation dataCheckerRefExecutionOperation = new DataCheckerRefExecutionOperation(); + dataCheckerRefExecutionOperation.setDevelopmentService(this); + return dataCheckerRefExecutionOperation; + } + +} diff --git a/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/standard/DataCheckerDevelopmentStandard.java b/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/standard/DataCheckerDevelopmentStandard.java new file mode 100644 index 000000000..ed7171e3d --- /dev/null +++ b/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/standard/DataCheckerDevelopmentStandard.java @@ -0,0 +1,42 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
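DataCheckerExecuteService (above) follows the development-standard wiring: the service builds its execution operation and injects a back-reference to itself, mirroring `dataCheckerRefExecutionOperation.setDevelopmentService(this)`. A toy model of that back-reference pattern — these types are simplified stand-ins, not the real DSS interfaces:

```java
// Simplified stand-in for the DSS development-standard wiring pattern:
// a service constructs its operation, then hands the operation a reference
// back to the service so it can reach service-level context later.
public class WiringDemo {
    public static class Service {
        public Operation createOperation() {
            Operation op = new Operation();
            op.setService(this); // analogous to setDevelopmentService(this)
            return op;
        }
    }

    public static class Operation {
        private Service service;
        public void setService(Service s) { this.service = s; }
        public Service getService() { return service; }
    }
}
```

The point of the indirection is that DSS can ask any `RefExecutionService` for an operation without knowing its concrete class, while the operation still reaches its owning service.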
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatapshere.dss.appconn.datachecker.standard; + +import com.webank.wedatapshere.dss.appconn.datachecker.service.DataCheckerExecuteService; +import com.webank.wedatasphere.dss.standard.app.development.standard.OnlyExecutionDevelopmentStandard; +import com.webank.wedatasphere.dss.standard.app.development.service.RefExecutionService; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class DataCheckerDevelopmentStandard extends OnlyExecutionDevelopmentStandard { + + + private static final Logger LOGGER = LoggerFactory.getLogger(DataCheckerDevelopmentStandard.class); + + + @Override + protected RefExecutionService createRefExecutionService() { + return new DataCheckerExecuteService(); + } + + @Override + public void init() { + LOGGER.info("class DataCheckerDevelopmentStandard init"); + } + + +} diff --git a/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/utils/HttpUtils.java b/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/utils/HttpUtils.java new file mode 100644 index 000000000..34208869e --- /dev/null +++ b/dss-appconn/appconns/dss-datachecker-appconn/src/main/java/com/webank/wedatapshere/dss/appconn/datachecker/utils/HttpUtils.java @@ -0,0 +1,117 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
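The HttpUtils class that follows signs each BDP mask request as `signature = MD5(MD5(appid + nonce + timestamp) + token)`. A dependency-free sketch of that scheme using `java.security.MessageDigest` instead of commons-codec — the `MaskSigner` name is illustrative:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

// Illustrative sketch of the BDP-mask request signature:
// signature = MD5(MD5(appid + nonce + timestamp) + token)
public class MaskSigner {
    public static String md5(String s) {
        try {
            MessageDigest md = MessageDigest.getInstance("MD5");
            byte[] digest = md.digest(s.getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) hex.append(String.format("%02x", b));
            return hex.toString();
        } catch (Exception e) {
            throw new IllegalStateException("MD5 unavailable", e);
        }
    }

    public static String sign(String appId, String nonce, long timestampSec, String token) {
        return md5(md5(appId + nonce + timestampSec) + token);
    }
}
```

The nonce is a 5-digit random string and the timestamp is in seconds; both travel in the query string alongside the signature, so the server can recompute and compare.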
* You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+package com.webank.wedatapshere.dss.appconn.datachecker.utils;
+
+import com.google.gson.Gson;
+import com.google.gson.GsonBuilder;
+import com.google.gson.reflect.TypeToken;
+
+import com.webank.wedatapshere.dss.appconn.datachecker.DataChecker;
+import okhttp3.*;
+import org.apache.commons.codec.digest.DigestUtils;
+import org.apache.commons.lang3.RandomStringUtils;
+import org.apache.log4j.Logger;
+
+import java.io.IOException;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.Properties;
+import java.util.concurrent.TimeUnit;
+
+public class HttpUtils {
+
+    private static final Logger logger = Logger.getLogger(HttpUtils.class);
+
+    public static Response httpClientHandleBase(String actionUrl, RequestBody requestBody, Map<String, String> urlMap) throws IOException {
+        String maskUrl = actionUrl + "appid=" + urlMap.get("appid") + "&&nonce=" + urlMap.get("nonce")
+                + "&&timestamp=" + urlMap.get("timestamp") + "&&signature=" + urlMap.get("signature");
+        OkHttpClient okHttpClient = new OkHttpClient.Builder()
+                .connectTimeout(10, TimeUnit.SECONDS)
+                .writeTimeout(20, TimeUnit.SECONDS)
+                .readTimeout(20, TimeUnit.SECONDS)
+                .build();
+
+        logger.info("access mask URL is:" + maskUrl);
+        Request request = new Request.Builder()
+                .url(maskUrl)
+                .post(requestBody)
+                .build();
+        Call call = okHttpClient.newCall(request);
+        Response response = call.execute();
+        logger.info("mask interface response code:" + response.code());
+        return response;
+    }
+
+    public static String httpClientHandle(String actionUrl, RequestBody requestBody, Map<String, String> urlMap) throws Exception {
+        String returnData = "";
+        try {
+            Response response = httpClientHandleBase(actionUrl, requestBody, urlMap);
+            returnData = response.body().string();
+            logger.info("mask interface return message:" + returnData);
+        } catch (IOException e) {
+            logger.error("mask interface call failed", e);
+        }
+        return returnData;
+    }
+
+    public static String httpClientHandle(String actionUrl) throws Exception {
+        OkHttpClient okHttpClient = new OkHttpClient.Builder()
+                .connectTimeout(10, TimeUnit.SECONDS)
+                .writeTimeout(20, TimeUnit.SECONDS)
+                .readTimeout(20, TimeUnit.SECONDS)
+                .build();
+        Request request = new Request.Builder()
+                .url(actionUrl)
+                .build();
+        Call call = okHttpClient.newCall(request);
+        String returnData = "";
+        try {
+            Response response = call.execute();
+            returnData = response.body().string();
+            logger.info("interface return message:" + returnData);
+        } catch (IOException e) {
+            logger.error("interface call failed", e);
+        }
+        return returnData;
+    }
+
+    public static Map<String, Object> getReturnMap(String dataStr) {
+        GsonBuilder gb = new GsonBuilder();
+        Gson g = gb.create();
+        return g.fromJson(dataStr, new TypeToken<Map<String, Object>>() {}.getType());
+    }
+
+    public static Map<String, String> initSelectParams(Properties props) {
+        String appid = props.getProperty(DataChecker.MASK_APP_ID);
+        String token = props.getProperty(DataChecker.MASK_APP_TOKEN);
+        String nonce = RandomStringUtils.random(5, "0123456789");
+        Long cur_time = System.currentTimeMillis() / 1000;
+        Map<String, String> requestProperties = new HashMap<>();
+        requestProperties.put("appid", appid);
+        requestProperties.put("nonce", nonce);
+        requestProperties.put("signature", getMD5(getMD5(appid + nonce + cur_time) + token));
+        requestProperties.put("timestamp", cur_time.toString());
+        return requestProperties;
+    }
+
+    public static String getMD5(String str) {
+        return DigestUtils.md5Hex(str.getBytes());
+    }
+
+} diff --git a/dss-appconn/appconns/dss-datachecker-appconn/src/main/resources/appconn.properties
b/dss-appconn/appconns/dss-datachecker-appconn/src/main/resources/appconn.properties new file mode 100644 index 000000000..ac52d9b55 --- /dev/null +++ b/dss-appconn/appconns/dss-datachecker-appconn/src/main/resources/appconn.properties @@ -0,0 +1,33 @@ +# +# Copyright 2019 WeBank +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# +# + +job.datachecker.jdo.option.name=job +job.datachecker.jdo.option.url=jdbc:mysql://127.0.0.1:3306/ +job.datachecker.jdo.option.username= +job.datachecker.jdo.option.password= +job.datachecker.jdo.option.login.type=base64 + +bdp.datachecker.jdo.option.name=bdp +bdp.datachecker.jdo.option.url= +bdp.datachecker.jdo.option.username= +bdp.datachecker.jdo.option.password= +bdp.datachecker.jdo.option.login.type=base64 + + +bdp.mask.url= +bdp.mask.app.id= +bdp.mask.app.token= + diff --git a/dss-appconn/appconns/dss-datachecker-appconn/src/main/resources/log4j.properties b/dss-appconn/appconns/dss-datachecker-appconn/src/main/resources/log4j.properties new file mode 100644 index 000000000..ee8619595 --- /dev/null +++ b/dss-appconn/appconns/dss-datachecker-appconn/src/main/resources/log4j.properties @@ -0,0 +1,36 @@ +# +# Copyright 2019 WeBank +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
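The appconn.properties above configures the mask service (`bdp.mask.url`, `bdp.mask.app.id`, `bdp.mask.app.token`), and `HttpUtils.initSelectParams` signs each request as `md5(md5(appid + nonce + timestamp) + token)`. A self-contained sketch of that scheme, using `java.security.MessageDigest` in place of commons-codec's `DigestUtils.md5Hex` (the credential values used with it would be made up, not real):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class MaskSignature {

    // Hex-encoded MD5; equivalent to commons-codec's DigestUtils.md5Hex used by HttpUtils.
    static String md5Hex(String s) {
        try {
            byte[] digest = MessageDigest.getInstance("MD5")
                    .digest(s.getBytes(StandardCharsets.UTF_8));
            StringBuilder sb = new StringBuilder();
            for (byte b : digest) {
                sb.append(String.format("%02x", b));
            }
            return sb.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // MD5 is always present on the JVM
        }
    }

    // Same double-MD5 scheme as HttpUtils.initSelectParams:
    // signature = md5(md5(appid + nonce + timestamp) + token)
    static String sign(String appid, String nonce, long timestamp, String token) {
        return md5Hex(md5Hex(appid + nonce + timestamp) + token);
    }
}
```

Because the nonce and timestamp are regenerated per request, the mask service can recompute and compare the signature without the token ever appearing in the URL.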
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# +# + +### set log levels ### + +log4j.rootCategory=INFO,console + +log4j.appender.console=org.apache.log4j.ConsoleAppender +log4j.appender.console.Threshold=INFO +log4j.appender.console.layout=org.apache.log4j.PatternLayout +#log4j.appender.console.layout.ConversionPattern= %d{ISO8601} %-5p (%t) [%F:%M(%L)] - %m%n +log4j.appender.console.layout.ConversionPattern= %d{ISO8601} %-5p (%t) %p %c{1} - %m%n + + +log4j.appender.com.webank.bdp.ide.core=org.apache.log4j.DailyRollingFileAppender +log4j.appender.com.webank.bdp.ide.core.Threshold=INFO +log4j.additivity.com.webank.bdp.ide.core=false +log4j.appender.com.webank.bdp.ide.core.layout=org.apache.log4j.PatternLayout +log4j.appender.com.webank.bdp.ide.core.Append=true +log4j.appender.com.webank.bdp.ide.core.File=logs/linkis.log +log4j.appender.com.webank.bdp.ide.core.layout.ConversionPattern= %d{ISO8601} %-5p (%t) [%F:%M(%L)] - %m%n + +log4j.logger.org.springframework=INFO diff --git a/dss-appconn/appconns/dss-datachecker-appconn/src/main/resources/log4j2.xml b/dss-appconn/appconns/dss-datachecker-appconn/src/main/resources/log4j2.xml new file mode 100644 index 000000000..8c40a73e8 --- /dev/null +++ b/dss-appconn/appconns/dss-datachecker-appconn/src/main/resources/log4j2.xml @@ -0,0 +1,38 @@ + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dss-appconn/appconns/dss-datachecker-appconn/src/main/scala/com/webank/wedatapshere/dss/appconn/datachecker/DataCheckerExecutionAction.scala 
b/dss-appconn/appconns/dss-datachecker-appconn/src/main/scala/com/webank/wedatapshere/dss/appconn/datachecker/DataCheckerExecutionAction.scala new file mode 100644 index 000000000..c99d9d635 --- /dev/null +++ b/dss-appconn/appconns/dss-datachecker-appconn/src/main/scala/com/webank/wedatapshere/dss/appconn/datachecker/DataCheckerExecutionAction.scala @@ -0,0 +1,42 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatapshere.dss.appconn.datachecker + + +import com.webank.wedatasphere.dss.standard.app.development.listener.common.{AbstractRefExecutionAction, LongTermRefExecutionAction, RefExecutionAction, RefExecutionState} + +class DataCheckerExecutionAction extends AbstractRefExecutionAction with LongTermRefExecutionAction{ + private[this] var _state: RefExecutionState = null + private var schedulerId: Int = _ + def state: RefExecutionState = _state + + def setState(value: RefExecutionState): Unit = { + _state = value + } + val response = new DataCheckerCompletedExecutionResponseRef(200) + private[this] var _dc: DataChecker = null + + def dc: DataChecker = _dc + + def setDc(value: DataChecker): Unit = { + _dc = value + } + + override def setSchedulerId(schedulerId: Int): Unit = this.schedulerId = schedulerId + + override def getSchedulerId: Int = schedulerId +} diff --git 
a/dss-appconn/appconns/dss-datachecker-appconn/src/main/scala/com/webank/wedatapshere/dss/appconn/datachecker/DataCheckerRefExecutionOperation.scala b/dss-appconn/appconns/dss-datachecker-appconn/src/main/scala/com/webank/wedatapshere/dss/appconn/datachecker/DataCheckerRefExecutionOperation.scala new file mode 100644 index 000000000..910623faa --- /dev/null +++ b/dss-appconn/appconns/dss-datachecker-appconn/src/main/scala/com/webank/wedatapshere/dss/appconn/datachecker/DataCheckerRefExecutionOperation.scala @@ -0,0 +1,190 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
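`DataCheckerRefExecutionOperation` below implements DSS's `LongTermRefExecutionOperation` contract: `submit` returns an action handle, the framework polls `state` (at `askStatePeriod` intervals, up to `maxLoopTime`) until the action completes, then reads `result`. Stripped of the DSS types, the control flow is roughly as follows — all names here are illustrative, not the DSS API:

```java
// Illustrative only: the submit/state/result contract of a DSS
// LongTermRefExecutionOperation, without the DSS types.
public class LongTermExecutionSketch {

    enum State { RUNNING, SUCCESS, FAILED }

    static class Action {
        State state = State.RUNNING;
        int polls = 0;
    }

    // submit() starts the work and returns a handle immediately.
    static Action submit() { return new Action(); }

    // state() is polled by the framework; each call re-checks the external
    // condition. Here completion is faked on the third poll, the way
    // DataChecker re-runs its table check until the partitions arrive.
    static State state(Action a) {
        if (++a.polls >= 3) {
            a.state = State.SUCCESS;
        }
        return a.state;
    }

    // result() is read once the state is terminal.
    static String result(Action a) {
        return a.state == State.SUCCESS ? "succeed" : "failed";
    }
}
```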
+ * + */ + +package com.webank.wedatapshere.dss.appconn.datachecker + +import java.util +import java.util.{Properties, UUID} + +import com.webank.wedatasphere.dss.standard.app.development.listener.common.{AsyncExecutionRequestRef, AsyncExecutionResponseRef, CompletedExecutionResponseRef, RefExecutionAction, RefExecutionState} +import com.webank.wedatasphere.dss.standard.app.development.listener.core.{Killable, LongTermRefExecutionOperation, Procedure} +import com.webank.wedatasphere.dss.standard.app.development.ref.ExecutionRequestRef +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService +import com.webank.wedatasphere.linkis.common.log.LogUtils +import com.webank.wedatasphere.linkis.common.utils.{Utils, VariableUtils} +import org.slf4j.LoggerFactory; + +class DataCheckerRefExecutionOperation extends LongTermRefExecutionOperation with Killable with Procedure{ + + private var service:DevelopmentService = _ + + private val logger = LoggerFactory.getLogger(classOf[DataCheckerRefExecutionOperation]) + + + + + + + protected def putErrorMsg(errorMsg: String, t: Throwable, action: DataCheckerExecutionAction): DataCheckerExecutionAction = t match { + + case t: Exception => + val response = action.response + response.setErrorMsg(errorMsg) + response.setException(t) + response.setIsSucceed(false) + action + } + + override def submit(requestRef: ExecutionRequestRef): RefExecutionAction = { + val asyncExecutionRequestRef = requestRef.asInstanceOf[AsyncExecutionRequestRef] + val nodeAction = new DataCheckerExecutionAction() + nodeAction.setId(UUID.randomUUID().toString()) + import scala.collection.JavaConversions.mapAsScalaMap + val InstanceConfig = this.service.getAppInstance.getConfig + val runTimeParams: scala.collection.mutable.Map[String, Object] = asyncExecutionRequestRef.getExecutionRequestRefContext().getRuntimeMap() + val variableParams: scala.collection.mutable.Map[String, Object]= 
asyncExecutionRequestRef.getJobContent.get("variable"). asInstanceOf[java.util.Map[String,Object]] + val inputParams =runTimeParams++variableParams + val properties = new Properties() + InstanceConfig.foreach { + case (key: String, value: Object) => + // avoid printing the password + properties.put(key, value.toString) + } + val tmpProperties = new Properties() + runTimeParams.foreach( + record=> + if (null == record._2) { + properties.put(record._1, "") + }else { + if (record._1.equalsIgnoreCase("job.desc")) { + val rows = record._2.asInstanceOf[String].split("\n") + rows.foreach(row => if (row.contains("=")) { + val endLocation = row.indexOf("="); + val rowKey = row.substring(0, endLocation) + val rowEnd = row.substring(endLocation + 1) + tmpProperties.put(rowKey, rowEnd) + }) + } else { + tmpProperties.put(record._1, record._2) + } + } + ) + tmpProperties.foreach { record => + logger.info("request params key : " + record._1 + ",value : " + record._2) + if (null == record._2) { + properties.put(record._1, "") + } + else { + if(inputParams.exists(x=>x._1.equalsIgnoreCase(VariableUtils.RUN_DATE))) { + val tmp:util.HashMap[String, Any] = new util.HashMap[String,Any]() + tmp.put(VariableUtils.RUN_DATE,inputParams.get(VariableUtils.RUN_DATE).getOrElse(null)) + properties.put(record._1,VariableUtils.replace(record._2.toString,tmp)) + }else { + properties.put(record._1, VariableUtils.replace(record._2.toString)) + } + } + } + Utils.tryCatch({ + val dc = new DataChecker(properties, nodeAction) + dc.run() + nodeAction.setDc(dc) + })(t => { + logger.error("DataChecker run failed for " + t.getMessage, t) + putErrorMsg("DataChecker run failed!
" + t.getMessage, t, nodeAction) + }) + nodeAction + + } + + override def state(action: RefExecutionAction): RefExecutionState = { + action match { + case action: DataCheckerExecutionAction => { + action.getExecutionRequestRefContext.appendLog("DataCheck is running!") + if (action.state.isCompleted) return action.state + Utils.tryCatch(action.dc.begineCheck(action))(t => { + action.setState(RefExecutionState.Failed) + logger.error("DataChecker run failed for " + t.getMessage, t) + putErrorMsg("DataChecker run failed! " + t.getMessage, t, action) + }) + action.state + } + case _ => RefExecutionState.Failed + } + } + + override def result(action: RefExecutionAction): CompletedExecutionResponseRef = { + val response:DataCheckerCompletedExecutionResponseRef = new DataCheckerCompletedExecutionResponseRef(200) + action match { + case action: DataCheckerExecutionAction => { + if (action.state.equals(RefExecutionState.Success)) { + response.setIsSucceed(true) + } else { + response.setErrorMsg(action.response.getErrorMsg) + response.setIsSucceed(false) + } + response + } + case _ => { + response.setIsSucceed(false) + response + } + + + } + } + + override def kill(action: RefExecutionAction): Boolean = action match { + case longTermAction: DataCheckerExecutionAction => + longTermAction.setKilledFlag(true) + longTermAction.setState(RefExecutionState.Killed) + true + } + + override def progress(action: RefExecutionAction): Float = { + //todo complete progress + 0.5f + } + + override def log(action: RefExecutionAction): String = { + action match { + case action: DataCheckerExecutionAction => { + if (!action.state.isCompleted) { + LogUtils.generateInfo("DataChecker is waiting for tables") + } else { + LogUtils.generateInfo("DataChecker successfully received info of tables") + } + } + case _ => LogUtils.generateERROR("Error for NodeExecutionAction ") + } + + } + + override def createAsyncResponseRef(requestRef: ExecutionRequestRef, action: RefExecutionAction): 
AsyncExecutionResponseRef = { + action match { + case action: DataCheckerExecutionAction => { + val response = super.createAsyncResponseRef(requestRef,action) + response.setMaxLoopTime(action.dc.maxWaitTime) + response.setAskStatePeriod(action.dc.queryFrequency) + response + } + } + } + + + override def setDevelopmentService(service: DevelopmentService): Unit = { + this.service = service + } +} diff --git a/dss-appconn/appconns/dss-eventchecker-appconn/pom.xml b/dss-appconn/appconns/dss-eventchecker-appconn/pom.xml new file mode 100644 index 000000000..1c5808677 --- /dev/null +++ b/dss-appconn/appconns/dss-eventchecker-appconn/pom.xml @@ -0,0 +1,171 @@ + + + + + + dss + com.webank.wedatasphere.dss + 1.0.0 + ../../../pom.xml + + 4.0.0 + + dss-eventchecker-appconn + + + + + com.webank.wedatasphere.dss + dss-appconn-core + ${dss.version} + + + + com.webank.wedatasphere.dss + dss-development-process-standard + ${dss.version} + + + + com.webank.wedatasphere.dss + dss-development-process-standard-execution + ${dss.version} + + + + org.apache.commons + commons-lang3 + 3.4 + + + + com.alibaba + druid + 1.0.28 + + + + log4j + log4j + 1.2.17 + + + + com.webank.wedatasphere.linkis + linkis-cs-client + ${linkis.version} + + + linkis-common + com.webank.wedatasphere.linkis + + + json4s-jackson_2.11 + org.json4s + + + + + + com.webank.wedatasphere.linkis + linkis-storage + ${linkis.version} + provided + + + com.webank.wedatasphere.dss + dss-common + ${dss.version} + provided + + + + + + + org.apache.maven.plugins + maven-deploy-plugin + + + net.alchim31.maven + scala-maven-plugin + + + org.apache.maven.plugins + maven-jar-plugin + + + org.apache.maven.plugins + maven-compiler-plugin + 3.3 + + 1.8 + 1.8 + UTF-8 + + + + org.apache.maven.plugins + maven-assembly-plugin + 2.3 + false + + + make-assembly + package + + single + + + + src/main/assembly/distribution.xml + + + + + + false + out + false + false + + src/main/assembly/distribution.xml + + + + + + + src/main/java + + 
**/*.xml + + + + src/main/resources + + **/*.properties + **/application.yml + **/bootstrap.yml + **/log4j2.xml + + + + + \ No newline at end of file diff --git a/dss-appconn/appconns/dss-eventchecker-appconn/src/main/assembly/distribution.xml b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/assembly/distribution.xml new file mode 100644 index 000000000..a35e2cb15 --- /dev/null +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/assembly/distribution.xml @@ -0,0 +1,66 @@ + + + + dss-eventchecker-appconn + + dir + + true + eventchecker + + + + + + lib + true + true + false + true + true + + + + + + ${basedir}/src/main/resources + + appconn.properties + + 0777 + / + unix + + + + ${basedir}/src/main/resources + + log4j.properties + log4j2.xml + + 0777 + conf + unix + + + + + + diff --git a/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/EventCheckerAppConn.java b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/EventCheckerAppConn.java new file mode 100644 index 000000000..bd277ca3e --- /dev/null +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/EventCheckerAppConn.java @@ -0,0 +1,38 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
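The `job.desc` handling in `DataCheckerRefExecutionOperation` above splits the multi-line description on `\n` and then on the first `=` of each row, so values may themselves contain `=`. That parsing in isolation (class and method names here are ours, not DSS's; the example keys in the usage below are illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class JobDescParser {

    // Split each row on the FIRST '=' only (row.indexOf("=") in the operation),
    // so keys cannot contain '=' but values can.
    static Map<String, String> parseJobDesc(String jobDesc) {
        Map<String, String> props = new LinkedHashMap<>();
        for (String row : jobDesc.split("\n")) {
            int eq = row.indexOf('=');
            if (eq >= 0) {
                props.put(row.substring(0, eq), row.substring(eq + 1));
            }
        }
        return props;
    }
}
```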
+ * + */ + +package com.webank.wedatasphere.dss.appconn.eventchecker; + +import com.webank.wedatasphere.dss.appconn.core.ext.OnlyDevelopmentAppConn; +import com.webank.wedatasphere.dss.appconn.core.impl.AbstractAppConn; +import com.webank.wedatasphere.dss.appconn.eventchecker.standard.EventCheckerDevelopmentStandard; +import com.webank.wedatasphere.dss.standard.app.development.standard.DevelopmentIntegrationStandard; + +public class EventCheckerAppConn extends AbstractAppConn implements OnlyDevelopmentAppConn { + + private EventCheckerDevelopmentStandard standard; + + @Override + protected void initialize() { + standard = new EventCheckerDevelopmentStandard(); + } + + @Override + public DevelopmentIntegrationStandard getOrCreateDevelopmentStandard() { + return standard; + } + +} diff --git a/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/EventCheckerCompletedExecutionResponseRef.java b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/EventCheckerCompletedExecutionResponseRef.java new file mode 100644 index 000000000..603766fbd --- /dev/null +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/EventCheckerCompletedExecutionResponseRef.java @@ -0,0 +1,50 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.eventchecker; + +import com.webank.wedatasphere.dss.standard.app.development.listener.common.CompletedExecutionResponseRef; + +import java.util.Map; + +public class EventCheckerCompletedExecutionResponseRef extends CompletedExecutionResponseRef { + private Exception exception; + public void setException(Exception exception) { + this.exception = exception; + } + + @Override + public Exception getException() { + return exception; + } + + public EventCheckerCompletedExecutionResponseRef(int status) { + super(status); + } + + public EventCheckerCompletedExecutionResponseRef(String responseBody, int status) { + super(responseBody, status); + } + + public void setStatus(int status){ + this.status = status; + } + + @Override + public Map toMap() { + return null; + } +} diff --git a/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/adapter/EventCheckAdapter.java b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/adapter/EventCheckAdapter.java new file mode 100644 index 000000000..f7004bb38 --- /dev/null +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/adapter/EventCheckAdapter.java @@ -0,0 +1,29 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.eventchecker.adapter; + +import org.apache.log4j.Logger; + +import java.util.Properties; + +public interface EventCheckAdapter { + + boolean sendMsg(int jobId, Properties props, Logger log); + + boolean reciveMsg(int jobId, Properties props, Logger log); + +} diff --git a/eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/connector/EventDruidFactory.java b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/connector/EventDruidFactory.java similarity index 87% rename from eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/connector/EventDruidFactory.java rename to dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/connector/EventDruidFactory.java index fb1e3060d..3f0f0650b 100644 --- a/eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/connector/EventDruidFactory.java +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/connector/EventDruidFactory.java @@ -1,8 +1,7 @@ /* * Copyright 2019 WeBank - * * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. + * you may not use this file except in compliance with the License. 
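`EventDruidFactory` below, like the `*.login.type=base64` entries in appconn.properties earlier, expects the datasource password to be stored base64-encoded and decoded at startup. The intended round trip, as a sketch (`myPass` is a made-up value):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class PasswordCodec {

    // How a password would be stored in the properties file when login.type=base64.
    static String encode(String plain) {
        return Base64.getEncoder().encodeToString(plain.getBytes(StandardCharsets.UTF_8));
    }

    // Mirrors EventDruidFactory: Base64.getDecoder().decode(...), then new String(..., "UTF-8").
    static String decode(String stored) {
        return new String(Base64.getDecoder().decode(stored), StandardCharsets.UTF_8);
    }
}
```

Note that base64 is an encoding, not encryption; the `login.type` switch only avoids storing the password verbatim.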
* You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 @@ -15,7 +14,7 @@ * */ -package com.webank.wedatasphere.dss.appjoint.schedulis.jobtype.connector; +package com.webank.wedatasphere.dss.appconn.eventchecker.connector; import com.alibaba.druid.pool.DruidDataSource; @@ -25,13 +24,6 @@ import java.util.Base64; import java.util.Properties; -/** - * @author georgeqiao - * @Title: EventDruidFactory - * @ProjectName Azkaban-EventChecker - * @date 2019/9/1822:10 - * @Description: TODO - */ public class EventDruidFactory { private static DruidDataSource msgInstance; @@ -55,14 +47,19 @@ private static DruidDataSource createDataSource(Properties props, Logger log, St String url = null; String username = null; String password = null; + String loginType = null; if(type.equals("Msg")){ name = props.getProperty("msg.eventchecker.jdo.option.name"); url = props.getProperty("msg.eventchecker.jdo.option.url"); username = props.getProperty("msg.eventchecker.jdo.option.username"); + loginType = props.getProperty("msg.eventchecker.jdo.option.login.type"); try { -// password = new String(Base64.getDecoder().decode(props.getProperty("msg.eventchecker.jdo.option.password").getBytes()),"UTF-8"); - password = props.getProperty("msg.eventchecker.jdo.option.password"); + if("base64".equals(loginType)) { + password = new String(Base64.getDecoder().decode(props.getProperty("msg.eventchecker.jdo.option.password").getBytes()), "UTF-8"); + }else{ + password = props.getProperty("msg.eventchecker.jdo.option.password"); + } } catch (Exception e){ log.error("password decode failed" + e); } diff --git a/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/cs/CSEventReceiverHelper.java b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/cs/CSEventReceiverHelper.java new file mode 100644 index 000000000..74fea4ff2 --- /dev/null +++
b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/cs/CSEventReceiverHelper.java @@ -0,0 +1,59 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.eventchecker.cs; + +import com.google.gson.Gson; +import com.webank.wedatasphere.linkis.cs.client.service.CSVariableService; +import com.webank.wedatasphere.linkis.cs.client.utils.ContextServiceUtils; +import com.webank.wedatasphere.linkis.cs.client.utils.SerializeHelper; +import com.webank.wedatasphere.linkis.cs.common.entity.enumeration.ContextScope; +import com.webank.wedatasphere.linkis.cs.common.entity.enumeration.ContextType; +import com.webank.wedatasphere.linkis.cs.common.entity.object.LinkisVariable; +import com.webank.wedatasphere.linkis.cs.common.entity.source.CommonContextKey; +import com.webank.wedatasphere.linkis.cs.common.entity.source.ContextKey; +import com.webank.wedatasphere.linkis.cs.common.utils.CSCommonUtils; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import java.util.Properties; + +public class CSEventReceiverHelper { + + private static final Logger LOGGER = LoggerFactory.getLogger(CSEventReceiverHelper.class); + + private static Gson gson = new Gson(); + + public static void putVariable(Properties properties, String msgBody, String saveKey) { + String contextIDStr = ContextServiceUtils.getContextIDStrByProperties(properties); + 
String nodeNameStr = ContextServiceUtils.getNodeNameStrByProperties(properties); + try { + + String key = saveKey; + String value = msgBody; + ContextKey contextKey = new CommonContextKey(); + contextKey.setContextScope(ContextScope.PUBLIC); + contextKey.setContextType(ContextType.OBJECT); + contextKey.setKey(CSCommonUtils.getVariableKey(nodeNameStr, key)); + LinkisVariable varValue = new LinkisVariable(); + varValue.setKey(key); + varValue.setValue(value); + CSVariableService.getInstance().putVariable(contextIDStr, SerializeHelper.serializeContextKey(contextKey), varValue); + } catch (Exception e) { + LOGGER.error("Failed to put variable to cs", e); + } + } +} diff --git a/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/entity/EventChecker.java b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/entity/EventChecker.java new file mode 100644 index 000000000..e3ddf7a47 --- /dev/null +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/entity/EventChecker.java @@ -0,0 +1,227 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.eventchecker.entity; + +import com.google.gson.Gson; +import com.webank.wedatasphere.dss.appconn.eventchecker.cs.CSEventReceiverHelper; +import com.webank.wedatasphere.dss.appconn.eventchecker.execution.EventCheckerExecutionAction; +import com.webank.wedatasphere.dss.appconn.eventchecker.service.EventCheckerService; +import com.webank.wedatasphere.dss.standard.app.development.listener.common.RefExecutionState; +import org.apache.commons.lang3.StringUtils; +import org.apache.log4j.Logger; +import java.util.HashMap; +import java.util.Map; +import java.util.Properties; +import java.util.regex.Matcher; +import java.util.regex.Pattern; + +public class EventChecker implements Runnable{ + public final static String WAIT_TIME = "max.receive.hours"; + public final static String WAIT_FOR_TIME = "wait.for.time"; + public final static String QUERY_FREQUENCY = "query.frequency"; + public final static String MSGTYPE="msg.type"; + public final static String SENDER="msg.sender"; + public final static String RECEIVER="msg.receiver"; + public final static String TOPIC="msg.topic"; + public final static String MSGNAME="msg.name"; + public final static String MSG="msg.body"; + public final static String EXEC_ID = "azkaban.flow.execid"; + public final static String SAVE_KEY="msg.savekey"; + public final static String USER_TIME="msg.init.querytime"; + public final static String TODAY="only.receive.today"; + public final static String AFTERSEND="msg.after.send"; + + private Properties p; + private String jobId; + private int execId; + private EventCheckerService wbDao=null; + EventCheckerExecutionAction backAction = null; + public Long maxWaitTime; + public int queryFrequency; + + private static Pattern pattern = Pattern.compile("[a-zA-Z_0-9@\\-]+"); + + private static final Logger logger = Logger.getRootLogger(); + + public EventChecker(Properties p, EventCheckerExecutionAction action) { + this.p = p; + this.jobId = "1"; + 
backAction = action; + String waitTime = p.getProperty(EventChecker.WAIT_TIME, "1"); + Double doubleWaitTime = Double.valueOf(waitTime) * 3600 * 1000; + maxWaitTime = Long.valueOf(doubleWaitTime.longValue()); + String query_frequency = p.getProperty(EventChecker.QUERY_FREQUENCY, "30000"); + queryFrequency = Integer.valueOf(query_frequency); + if(queryFrequency <10000){ + queryFrequency = 10000; + } + } + + @Override + public void run() { + try { + backAction.setState(RefExecutionState.Running); + if (p == null) { + throw new RuntimeException("Properties is null. Can't continue"); + } + if (checkParamMap(p, MSGTYPE)) { + throw new RuntimeException("parameter " + MSGTYPE + " can not be blank."); + } + if (checkParamMap(p, TOPIC)) { + throw new RuntimeException("parameter " + TOPIC + " can not be blank."); + } else { + String topic = p.getProperty(TOPIC); + if (!topic.matches("[^_]*_[^_]*_[^_]*")) { + throw new RuntimeException("Error format of topic parameter. Accept: XX_XX_XX."); + } + } + if (checkParamMap(p, MSGNAME)) { + throw new RuntimeException("parameter " + MSGNAME + " can not be blank."); + } + wbDao = EventCheckerService.getInstance(); + execId = Integer.parseInt(jobId); + boolean success = false; + if (p.getProperty(MSGTYPE).equals("SEND")) { + if (checkParamMap(p, SENDER)) { + throw new RuntimeException("parameter " + SENDER + " can not be blank."); + } else { + String sender = p.getProperty(SENDER); + if (!sender.matches("[^@]*@[^@]*@[^@]*")) { + throw new RuntimeException("Error format of sender parameter. 
Accept: XX@XX@XX.");
+                    }
+                }
+                if (p.containsKey(MSG) && StringUtils.isNotEmpty(p.getProperty(MSG)) && p.getProperty(MSG).length() > 250) {
+                    throw new RuntimeException("parameter " + MSG + " length must be less than 250!");
+                }
+                success = wbDao.sendMsg(execId, p, logger);
+                if (success) {
+                    backAction.setState(RefExecutionState.Success);
+                } else {
+                    throw new RuntimeException("Failed to send message.");
+                }
+            } else if (p.getProperty(MSGTYPE).equals("RECEIVE")) {
+                backAction.eventType("RECEIVE");
+                receiveMsg();
+            } else {
+                throw new RuntimeException("Please input a correct value for parameter msg.type: RECEIVE or SEND.");
+            }
+        } catch (Exception ex) {
+            backAction.setState(RefExecutionState.Failed);
+            throw ex;
+        }
+    }
+
+    public boolean receiveMsg() {
+        boolean success = false;
+        if (p.getProperty(MSGTYPE).equals("RECEIVE")) {
+            if (checkParamMap(p, RECEIVER)) {
+                backAction.setState(RefExecutionState.Failed);
+                throw new RuntimeException("parameter " + RECEIVER + " cannot be blank.");
+            } else {
+                String receiver = p.getProperty(RECEIVER);
+                if (!receiver.matches("[^@]*@[^@]*@[^@]*")) {
+                    backAction.setState(RefExecutionState.Failed);
+                    throw new RuntimeException("Error format of receiver parameter. 
Accept: XX@XX@XX.");
+                }
+            }
+            String userTime = checkTimeParamMap(p, USER_TIME);
+            if (StringUtils.isNotEmpty(userTime)) {
+                p.put(USER_TIME, userTime);
+            }
+            success = wbDao.reciveMsg(execId, p, logger);
+            if (success) {
+                backAction.saveKeyAndValue(getJobSaveKeyAndValue());
+                backAction.setState(RefExecutionState.Success);
+            } else {
+                backAction.setState(RefExecutionState.Running);
+            }
+        }
+        return success;
+    }
+
+    public String getJobSaveKeyAndValue() {
+        Map<String, String> saveValueMap = new HashMap<>();
+        String msgBody = p.getProperty(MSG, "{}");
+        String saveKey = p.getProperty(SAVE_KEY, "msg.body");
+        CSEventReceiverHelper.putVariable(this.p, msgBody, saveKey);
+        if (StringUtils.isEmpty(saveKey)) {
+            saveValueMap.put("msg.body", msgBody);
+        } else {
+            saveValueMap.put(saveKey, msgBody);
+        }
+        Gson gson = new Gson();
+        String saveValueJson = gson.toJson(saveValueMap);
+        logger.info("Output msg body: " + saveValueJson);
+        return saveValueJson;
+    }
+
+    public void cancel() throws InterruptedException {
+    }
+
+    private boolean checkParamMap(Properties p, String key) {
+        if (!p.containsKey(key)) {
+            throw new RuntimeException("parameter " + key + " is Empty.");
+        }
+        boolean checkFlag = StringUtils.isEmpty(p.getProperty(key));
+        if (!MSG.equals(key) && StringUtils.contains(p.getProperty(key), " ")) {
+            throw new RuntimeException("parameter " + key + " cannot contain spaces!");
+        }
+        if (!checkNoStandardStr(p.getProperty(key))) {
+            throw new RuntimeException("parameter " + key + " accepts letters, numbers and _@- only.");
+        }
+        if (p.getProperty(key).length() > 45) {
+            throw new RuntimeException("parameter " + key + " length must be less than 45!");
+        }
+        return checkFlag;
+    }
+
+    private boolean checkNoStandardStr(String param) {
+        Matcher matcher = pattern.matcher(param);
+        return matcher.matches();
+    }
+
+    private void checkTimeParam(Properties p, String key) {
+        if (p.containsKey(key)) {
+            String waitForTime = p.getProperty(key);
+            
if(!waitForTime.matches("^(0?[0-9]|1[0-9]|2[0-3]):(0?[0-9]|[1-5][0-9])$")){ + throw new RuntimeException("Parameter " + key + " Time format error ! For example: HH:mm"); + } + } + } + + private String checkTimeParamMap(Properties p, String key){ + if(p.containsKey(key)){ + String userTime = p.getProperty(key); + Pattern ptime = Pattern.compile("^([1][7-9][0-9][0-9]|[2][0][0-9][0-9])(\\-)([0][1-9]|[1][0-2])(\\-)([0-2][1-9]|[3][0-1])(\\s)([0-1][0-9]|[2][0-3])(:)([0-5][0-9])(:)([0-5][0-9])$"); + Matcher m = ptime.matcher(userTime); + if(!m.matches()){ + throw new RuntimeException("Parameter " + key + " Time format error ! For example: yyyy-MM-dd HH:mm:ss"); + } + return userTime; + }else{ + return null; + } + } +} diff --git a/eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/exception/UndefinedPropertyException.java b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/exception/UndefinedPropertyException.java similarity index 84% rename from eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/exception/UndefinedPropertyException.java rename to dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/exception/UndefinedPropertyException.java index d92be51c3..fe86138ea 100644 --- a/eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/exception/UndefinedPropertyException.java +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/exception/UndefinedPropertyException.java @@ -1,8 +1,7 @@ /* * Copyright 2019 WeBank - * * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. + * you may not use this file except in compliance with the License. 
* You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 @@ -15,7 +14,7 @@ * */ -package com.webank.wedatasphere.dss.appjoint.schedulis.jobtype.exception; +package com.webank.wedatasphere.dss.appconn.eventchecker.exception; /** * Indicates that a required property is missing from the Props diff --git a/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/service/AbstractEventCheck.java b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/service/AbstractEventCheck.java new file mode 100644 index 000000000..aa63a87f7 --- /dev/null +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/service/AbstractEventCheck.java @@ -0,0 +1,161 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.eventchecker.service; + +import com.alibaba.druid.pool.DruidDataSource; +import com.webank.wedatasphere.dss.appconn.eventchecker.connector.EventDruidFactory; +import com.webank.wedatasphere.dss.appconn.eventchecker.adapter.EventCheckAdapter; +import com.webank.wedatasphere.dss.appconn.eventchecker.entity.EventChecker; + +import org.apache.log4j.Logger; + +import java.net.InetAddress; +import java.net.NetworkInterface; +import java.net.SocketException; +import java.sql.Connection; +import java.sql.PreparedStatement; +import java.sql.ResultSet; +import java.sql.SQLException; +import java.util.Enumeration; +import java.util.Properties; + +import javax.sql.DataSource; + +public abstract class AbstractEventCheck implements EventCheckAdapter { + static DataSource msgDS; + String topic; + String msgName; + String receiver; + String sender; + String receiveToday; + String userTime; + String waitTime; + String query_frequency; + String wait_for_time; + String msg; + String afterSend; + + DataSource getMsgDS(Properties props, Logger log){ + if (msgDS == null) { + msgDS = EventDruidFactory.getMsgInstance(props, log); + if (msgDS == null) { + log.error("Error getting Druid DataSource instance"); + } + } + return msgDS; + } + + void initECParams(Properties props){ + topic = props.getProperty(EventChecker.TOPIC); + msgName = props.getProperty(EventChecker.MSGNAME); + receiver = props.getProperty(EventChecker.RECEIVER); + sender = props.getProperty(EventChecker.SENDER); + msg = props.getProperty(EventChecker.MSG); + receiveToday = props.getProperty(EventChecker.TODAY); + userTime = props.getProperty(EventChecker.USER_TIME); + waitTime = props.getProperty(EventChecker.WAIT_TIME, "1"); + query_frequency = props.getProperty(EventChecker.QUERY_FREQUENCY, "30000"); + afterSend = props.getProperty(EventChecker.AFTERSEND); + } + + Connection getEventCheckerConnection(Properties props, Logger log){ + Connection connection = null; + 
try { + connection = getMsgDS(props,log).getConnection(); + } catch (SQLException e) { + throw new RuntimeException("Error getting DB Connection instance {} " + e); + } + return connection; + } + + @Override + public boolean sendMsg(int jobId, Properties props, Logger log) { + return false; + } + + @Override + public boolean reciveMsg(int jobId, Properties props, Logger log) { + return false; + } + + void closeConnection(Connection conn, Logger log) { + if (conn != null) { + try { + conn.close(); + } catch (SQLException e) { + log.error("Error closing connection", e); + } + } + } + + void closeQueryRef(ResultSet rs, Logger log) { + if (rs != null) { + try { + rs.close(); + } catch (SQLException e) { + log.error("Error closing result set", e); + } + } + + } + + void closeQueryStmt(PreparedStatement stmt, Logger log) { + if (stmt != null) { + try { + stmt.close(); + } catch (SQLException e) { + log.error("Error closing result stmt", e); + } + } + + } + + + public static void closeDruidDataSource() { + DruidDataSource msgDSObject = (DruidDataSource) msgDS; + if (msgDSObject != null) { + msgDSObject.close(); + } + + } + + String getLinuxLocalIp(Logger log) { + String ip = "127.0.0.1"; + try { + for (Enumeration en = NetworkInterface.getNetworkInterfaces(); en.hasMoreElements(); ) { + NetworkInterface intf = en.nextElement(); + String name = intf.getName(); + if (!name.contains("docker") && !name.contains("lo")) { + for (Enumeration enumIpAddr = intf.getInetAddresses(); enumIpAddr.hasMoreElements(); ) { + InetAddress inetAddress = enumIpAddr.nextElement(); + if (!inetAddress.isLoopbackAddress()) { + String ipaddress = inetAddress.getHostAddress().toString(); + if (!ipaddress.contains("::") && !ipaddress.contains("0:0:") && !ipaddress.contains("fe80")) { + ip = ipaddress; + } + } + } + } + } + } catch (SocketException ex) { + log.warn("get ip failed", ex); + + } + log.info("Send IP:" + ip); + return ip; + } +} diff --git 
a/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/service/AbstractEventCheckReceiver.java b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/service/AbstractEventCheckReceiver.java new file mode 100644 index 000000000..dd48920fd --- /dev/null +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/service/AbstractEventCheckReceiver.java @@ -0,0 +1,176 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ *
+ */
+
+package com.webank.wedatasphere.dss.appconn.eventchecker.service;
+
+import com.webank.wedatasphere.dss.appconn.eventchecker.entity.EventChecker;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.commons.lang3.time.DateFormatUtils;
+import org.apache.log4j.Logger;
+
+import java.sql.Connection;
+import java.sql.PreparedStatement;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.util.Date;
+import java.util.Properties;
+
+public class AbstractEventCheckReceiver extends AbstractEventCheck {
+    /**
+     * Fill the result into the source
+     */
+    String setConsumedMsg(Properties props, Logger log, String[] consumedMsgInfo) {
+        String vNewMsgID = "";
+        try {
+            if (consumedMsgInfo != null && consumedMsgInfo.length == 4) {
+                vNewMsgID = consumedMsgInfo[0];
+                String vMsgName = consumedMsgInfo[1];
+                String vSender = consumedMsgInfo[2];
+                String vMsg = consumedMsgInfo[3];
+                if (null == vMsg) {
+                    props.put(EventChecker.MSG, "NULL");
+                } else {
+                    props.put(EventChecker.MSG, vMsg);
+                }
+                log.info("Received message : messageID: " + vNewMsgID + ", messageName: " + vMsgName + ", sender: " + vSender
+                        + ", messageBody: " + vMsg);
+            }
+        } catch (Exception e) {
+            log.error("setConsumedMsg failed", e);
+            return vNewMsgID;
+        }
+        return vNewMsgID;
+    }
+
+    /**
+     * Update consumption status
+     */
+    boolean updateMsgOffset(int jobId, Properties props, Logger log, String[] consumedMsgInfo, String lastMsgId) {
+        boolean result = false;
+        String vNewMsgID = "-1";
+        PreparedStatement updatePstmt = null;
+        Connection msgConn = null;
+        vNewMsgID = setConsumedMsg(props, log, consumedMsgInfo);
+        try {
+            if (StringUtils.isNotBlank(vNewMsgID) && !"-1".equals(vNewMsgID)) {
+                msgConn = getEventCheckerConnection(props, log);
+                if (msgConn == null) return false;
+                int vProcessID = jobId;
+                String vReceiveTime = DateFormatUtils.format(new Date(), "yyyy-MM-dd HH:mm:ss");
+                String sqlForUpdateMsg = 
"INSERT INTO event_status(receiver,topic,msg_name,receive_time,msg_id) VALUES(?,?,?,?,?) ON DUPLICATE KEY UPDATE receive_time=VALUES(receive_time),msg_id= CASE WHEN msg_id=? THEN VALUES(msg_id) ELSE msg_id END";
+                log.info("last message offset is: " + lastMsgId);
+                updatePstmt = msgConn.prepareStatement(sqlForUpdateMsg);
+                updatePstmt.setString(1, receiver);
+                updatePstmt.setString(2, topic);
+                updatePstmt.setString(3, msgName);
+                updatePstmt.setString(4, vReceiveTime);
+                updatePstmt.setString(5, vNewMsgID);
+                // Bind lastMsgId as a parameter instead of concatenating it into the SQL.
+                updatePstmt.setString(6, lastMsgId);
+                int updaters = updatePstmt.executeUpdate();
+                log.info("updateMsgOffset update result is: " + updaters);
+                if (updaters != 0) {
+                    log.info("Received message successfully, update message status succeeded, consumed flow execution ID: " + vProcessID);
+                    //return true after update success
+                    result = true;
+                } else {
+                    log.info("Received message successfully, but update message status failed, consumed flow execution ID: " + vProcessID);
+                    result = false;
+                }
+            } else {
+                result = false;
+            }
+        } catch (SQLException e) {
+            log.error("Error updating msg offset", e);
+            return false;
+        } finally {
+            closeQueryStmt(updatePstmt, log);
+            closeConnection(msgConn, log);
+        }
+        return result;
+    }
+
+    /**
+     * get consumption progress
+     */
+    String getOffset(int jobId, Properties props, Logger log) {
+        String sqlForReadMsgID = "SELECT msg_id FROM event_status WHERE receiver=? AND topic=? AND msg_name=?";
+        PreparedStatement pstmtForGetID = null;
+        Connection msgConn = null;
+        ResultSet rs = null;
+        String lastMsgId = "0";
+        try {
+            msgConn = getEventCheckerConnection(props, log);
+            pstmtForGetID = msgConn.prepareStatement(sqlForReadMsgID);
+            pstmtForGetID.setString(1, receiver);
+            pstmtForGetID.setString(2, topic);
+            pstmtForGetID.setString(3, msgName);
+            rs = pstmtForGetID.executeQuery();
+            lastMsgId = rs.last() ? 
rs.getString("msg_id"):"0"; + } catch (SQLException e) { + throw new RuntimeException("get Offset failed " + e); + }finally { + closeQueryStmt(pstmtForGetID,log); + closeConnection(msgConn,log); + closeQueryRef(rs,log); + } + log.info("The last record id was " + lastMsgId); + return lastMsgId; + } + + /** + * Consistent entrance to consumer message + */ + String[] getMsg(Properties props, Logger log,String ... params){ + String sqlForReadTMsg = "SELECT * FROM event_queue WHERE topic=? AND msg_name=? AND send_time >=? AND send_time <=? AND msg_id >? ORDER BY msg_id ASC LIMIT 1"; + PreparedStatement pstmt = null; + Connection msgConn = null; + ResultSet rs = null; + String[] consumedMsgInfo = null; + try { + msgConn = getEventCheckerConnection(props,log); + pstmt = msgConn.prepareCall(sqlForReadTMsg); + pstmt.setString(1, topic); + pstmt.setString(2, msgName); + pstmt.setString(3, params[0]); + pstmt.setString(4, params[1]); + pstmt.setString(5, params[2]); + log.info("param {} StartTime: " + params[0] + ", EndTime: " + params[1] + + ", Topic: " + topic + ", MessageName: " + msgName + ", LastMessageID: " + params[2]); + rs = pstmt.executeQuery(); + + if(rs.last()){ + consumedMsgInfo = new String[4]; + String[] msgKey = new String[]{"msg_id","msg_name","sender","msg"}; + for (int i = 0;i <= 3;i++) { + consumedMsgInfo[i] = rs.getString(msgKey[i]); + } + } + } catch (SQLException e) { + throw new RuntimeException("EventChecker failed to receive message" + e); + } finally { + closeQueryStmt(pstmt, log); + closeConnection(msgConn, log); + closeQueryRef(rs, log); + } + return consumedMsgInfo; + } + + @Override + public boolean reciveMsg(int jobId, Properties props, Logger log) { + return super.reciveMsg(jobId, props, log); + } +} diff --git a/eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/service/DefaultEventcheckReceiver.java 
b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/service/DefaultEventcheckReceiver.java similarity index 93% rename from eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/service/DefaultEventcheckReceiver.java rename to dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/service/DefaultEventcheckReceiver.java index 2759256b9..9fa17274f 100644 --- a/eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/service/DefaultEventcheckReceiver.java +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/service/DefaultEventcheckReceiver.java @@ -1,8 +1,7 @@ /* * Copyright 2019 WeBank - * * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. + * you may not use this file except in compliance with the License. 
* You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 @@ -15,11 +14,10 @@ * */ -package com.webank.wedatasphere.dss.appjoint.schedulis.jobtype.service; +package com.webank.wedatasphere.dss.appconn.eventchecker.service; -import com.webank.wedatasphere.dss.appjoint.schedulis.jobtype.entity.EventChecker; import org.apache.commons.lang3.time.DateFormatUtils; import org.apache.log4j.Logger; @@ -29,13 +27,6 @@ import java.util.Date; import java.util.Properties; -/** - * @author georgeqiao - * @Title: DefaultEventcheckReceiver - * @ProjectName Azkaban-EventChecker - * @date 2019/9/1822:10 - * @Description: TODO - */ public class DefaultEventcheckReceiver extends AbstractEventCheckReceiver { String todayStartTime; String todayEndTime; diff --git a/eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/service/EventCheckSender.java b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/service/EventCheckSender.java similarity index 89% rename from eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/service/EventCheckSender.java rename to dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/service/EventCheckSender.java index 662234ec4..8e21ab72d 100644 --- a/eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/service/EventCheckSender.java +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/service/EventCheckSender.java @@ -1,8 +1,7 @@ /* * Copyright 2019 WeBank - * * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. + * you may not use this file except in compliance with the License. 
* You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 @@ -15,7 +14,7 @@ * */ -package com.webank.wedatasphere.dss.appjoint.schedulis.jobtype.service; +package com.webank.wedatasphere.dss.appconn.eventchecker.service; import org.apache.commons.lang3.time.DateFormatUtils; import org.apache.log4j.Logger; @@ -26,13 +25,6 @@ import java.util.Date; import java.util.Properties; -/** - * @author georgeqiao - * @Title: EventCheckSender - * @ProjectName Azkaban-EventChecker - * @date 2019/9/1822:10 - * @Description: TODO - */ public class EventCheckSender extends AbstractEventCheck { public EventCheckSender(Properties props) { diff --git a/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/service/EventCheckerExecuteService.java b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/service/EventCheckerExecuteService.java new file mode 100644 index 000000000..0aeef73cb --- /dev/null +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/service/EventCheckerExecuteService.java @@ -0,0 +1,32 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.eventchecker.service; + +import com.webank.wedatasphere.dss.appconn.eventchecker.execution.EventCheckerRefExecutionOperation; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefExecutionOperation; +import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefExecutionService; + +public class EventCheckerExecuteService extends AbstractRefExecutionService { + + @Override + public RefExecutionOperation createRefExecutionOperation() { + EventCheckerRefExecutionOperation eventCheckerRefExecutionOperation = new EventCheckerRefExecutionOperation(); + eventCheckerRefExecutionOperation.setDevelopmentService(this); + return eventCheckerRefExecutionOperation; + } + +} diff --git a/eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/service/EventCheckerService.java b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/service/EventCheckerService.java similarity index 77% rename from eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/service/EventCheckerService.java rename to dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/service/EventCheckerService.java index 7f75170f5..2ca27bec3 100644 --- a/eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/service/EventCheckerService.java +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/service/EventCheckerService.java @@ -1,8 +1,7 @@ /* * Copyright 2019 WeBank - * * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. + * you may not use this file except in compliance with the License. 
* You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 @@ -15,19 +14,12 @@ * */ -package com.webank.wedatasphere.dss.appjoint.schedulis.jobtype.service; +package com.webank.wedatasphere.dss.appconn.eventchecker.service; import org.apache.log4j.Logger; import java.util.Properties; -/** - * @author georgeqiao - * @Title: EventCheckerService - * @ProjectName Azkaban-EventChecker - * @date 2019/9/1822:10 - * @Description: TODO - */ public class EventCheckerService { private static EventCheckerService instance; @@ -52,8 +44,6 @@ public boolean sendMsg(int jobId, Properties props, Logger log) { } /** - * 接收消息 接收消息先查询消费记录,有则从上一次消费后开始消费,没有则从任务启动时间点后开始消费。 - * 接收消息是以主动查询的方式进行的,在没有超出设定目标的时间内,反复查询目标消息。 * Receiving a message first queries the consumption record, * and then starts to consume after the last consumption, and no consumption * starts after the job starts. The received message is performed in an active diff --git a/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/standard/EventCheckerDevelopmentStandard.java b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/standard/EventCheckerDevelopmentStandard.java new file mode 100644 index 000000000..597ea95e3 --- /dev/null +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/standard/EventCheckerDevelopmentStandard.java @@ -0,0 +1,46 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+ * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.eventchecker.standard; + +import com.webank.wedatasphere.dss.appconn.eventchecker.service.EventCheckerExecuteService; +import com.webank.wedatasphere.dss.standard.app.development.standard.OnlyExecutionDevelopmentStandard; +import com.webank.wedatasphere.dss.standard.app.development.service.RefExecutionService; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class EventCheckerDevelopmentStandard extends OnlyExecutionDevelopmentStandard { + + private static final Logger LOGGER = LoggerFactory.getLogger(EventCheckerDevelopmentStandard.class); + + @Override + protected RefExecutionService createRefExecutionService() { + return new EventCheckerExecuteService(); + } + + + @Override + public void init() { + LOGGER.info("class EventCheckerDevelopmentStandard init"); + } + + + @Override + public String getStandardName() { + return "EventCheckDevelopmentStandard"; + } + +} diff --git a/eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/utils/Props.java b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/utils/Props.java similarity index 98% rename from eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/utils/Props.java rename to dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/utils/Props.java index 1e19756a5..e4e8cc9b8 100644 --- a/eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/utils/Props.java +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/utils/Props.java @@ -1,8 +1,7 @@ /* * Copyright 2019 WeBank - * * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use 
this file except in compliance with the License. + * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 @@ -15,9 +14,9 @@ * */ -package com.webank.wedatasphere.dss.appjoint.schedulis.jobtype.utils; +package com.webank.wedatasphere.dss.appconn.eventchecker.utils; -import com.webank.wedatasphere.dss.appjoint.schedulis.jobtype.exception.UndefinedPropertyException; +import com.webank.wedatasphere.dss.appconn.eventchecker.exception.UndefinedPropertyException; import org.apache.log4j.Logger; @@ -163,11 +162,6 @@ public static Props clone(final Props p) { return copyNext(p); } - /** - * - * @param source - * @return - */ private static Props copyNext(final Props source) { Props priorNodeCopy = null; if (source.getParent() != null) { @@ -181,14 +175,9 @@ private static Props copyNext(final Props source) { return dest; } - /** - * - * @param inputStream - * @throws IOException - */ private void loadFrom(final InputStream inputStream) throws IOException { final Properties properties = new Properties(); - //解决.job文件中包含中文,读取乱码的问题。 + // Solve the problem that the. Job file contains Chinese and reads garbled code. 
BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(inputStream , "UTF-8")); properties.load(bufferedReader); this.put(properties); @@ -793,8 +782,6 @@ public void logProperties(final Logger logger, final String comment) { } } - /** - */ @Override public boolean equals(final Object o) { if (o == this) { @@ -827,9 +814,6 @@ public boolean equalsProps(final Props p) { return myKeySet.size() == p.getKeySet().size(); } - /** - * - */ @Override public int hashCode() { int code = this._current.hashCode(); @@ -839,9 +823,6 @@ public int hashCode() { return code; } - /** - * - */ @Override public String toString() { final StringBuilder builder = new StringBuilder("{"); diff --git a/eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/utils/Utils.java b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/utils/Utils.java similarity index 86% rename from eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/utils/Utils.java rename to dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/utils/Utils.java index 682033e55..a53dbf27e 100644 --- a/eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/utils/Utils.java +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/eventchecker/utils/Utils.java @@ -1,8 +1,7 @@ /* * Copyright 2019 WeBank - * * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. + * you may not use this file except in compliance with the License. 
* You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 @@ -15,7 +14,7 @@ * */ -package com.webank.wedatasphere.dss.appjoint.schedulis.jobtype.utils; +package com.webank.wedatasphere.dss.appconn.eventchecker.utils; /** * A util helper class full of static methods that are commonly used. diff --git a/dss-appconn/appconns/dss-eventchecker-appconn/src/main/resources/appjoint.properties b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/resources/appjoint.properties new file mode 100644 index 000000000..9538027d4 --- /dev/null +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/resources/appjoint.properties @@ -0,0 +1,24 @@ +# +# Copyright 2019 WeBank +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# +# + +msg.eventchecker.jdo.option.name=msg +msg.eventchecker.jdo.option.url=jdbc:mysql://127.0.0.1:3306/ +msg.eventchecker.jdo.option.username=user +msg.eventchecker.jdo.option.password= +msg.eventchecker.jdo.option.login.type=base64 + + + diff --git a/dss-appconn/appconns/dss-eventchecker-appconn/src/main/resources/log4j.properties b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/resources/log4j.properties new file mode 100644 index 000000000..ee8619595 --- /dev/null +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/resources/log4j.properties @@ -0,0 +1,36 @@ +# +# Copyright 2019 WeBank +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# +# + +### set log levels ### + +log4j.rootCategory=INFO,console + +log4j.appender.console=org.apache.log4j.ConsoleAppender +log4j.appender.console.Threshold=INFO +log4j.appender.console.layout=org.apache.log4j.PatternLayout +#log4j.appender.console.layout.ConversionPattern= %d{ISO8601} %-5p (%t) [%F:%M(%L)] - %m%n +log4j.appender.console.layout.ConversionPattern= %d{ISO8601} %-5p (%t) %p %c{1} - %m%n + + +log4j.appender.com.webank.bdp.ide.core=org.apache.log4j.DailyRollingFileAppender +log4j.appender.com.webank.bdp.ide.core.Threshold=INFO +log4j.additivity.com.webank.bdp.ide.core=false +log4j.appender.com.webank.bdp.ide.core.layout=org.apache.log4j.PatternLayout +log4j.appender.com.webank.bdp.ide.core.Append=true +log4j.appender.com.webank.bdp.ide.core.File=logs/linkis.log +log4j.appender.com.webank.bdp.ide.core.layout.ConversionPattern= %d{ISO8601} %-5p (%t) [%F:%M(%L)] - %m%n + +log4j.logger.org.springframework=INFO diff --git a/dss-appconn/appconns/dss-eventchecker-appconn/src/main/resources/log4j2.xml b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/resources/log4j2.xml new file mode 100644 index 000000000..8c40a73e8 --- /dev/null +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/resources/log4j2.xml @@ -0,0 +1,38 @@ + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dss-appconn/appconns/dss-eventchecker-appconn/src/main/scala/com/webank/wedatasphere/dss/appconn/eventchecker/execution/EventCheckerExecutionAction.scala b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/scala/com/webank/wedatasphere/dss/appconn/eventchecker/execution/EventCheckerExecutionAction.scala new file mode 100644 index 000000000..e449f6453 --- /dev/null +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/scala/com/webank/wedatasphere/dss/appconn/eventchecker/execution/EventCheckerExecutionAction.scala @@ -0,0 +1,63 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file 
except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.eventchecker.execution + +import com.webank.wedatasphere.dss.appconn.eventchecker.EventCheckerCompletedExecutionResponseRef +import com.webank.wedatasphere.dss.appconn.eventchecker.entity.EventChecker +import com.webank.wedatasphere.dss.standard.app.development.listener.common.{AbstractRefExecutionAction, LongTermRefExecutionAction, RefExecutionState} + +class EventCheckerExecutionAction extends AbstractRefExecutionAction with LongTermRefExecutionAction { + private[this] var _state: RefExecutionState = _ + private var schedulerId: Int = _ + + def state: RefExecutionState = _state + + def setState(value: RefExecutionState): Unit = { + _state = value + } + + + val response = new EventCheckerCompletedExecutionResponseRef(200) + + private[this] var _saveKeyAndValue: String = null + + def saveKeyAndValue: String = _saveKeyAndValue + + def saveKeyAndValue(value: String): Unit = { + _saveKeyAndValue = value + } + + private[this] var _eventType: String = "SEND" + + def eventType: String = _eventType + + def eventType(value: String): Unit = { + _eventType = value + } + + private[this] var _ec: EventChecker = null + + def ec: EventChecker = _ec + + def setEc(value: EventChecker): Unit = { + _ec = value + } + + override def setSchedulerId(schedulerId: Int): Unit = this.schedulerId = schedulerId + + override def getSchedulerId: Int = schedulerId +} diff --git 
a/dss-appconn/appconns/dss-eventchecker-appconn/src/main/scala/com/webank/wedatasphere/dss/appconn/eventchecker/execution/EventCheckerRefExecutionOperation.scala b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/scala/com/webank/wedatasphere/dss/appconn/eventchecker/execution/EventCheckerRefExecutionOperation.scala new file mode 100644 index 000000000..bfd8a2593 --- /dev/null +++ b/dss-appconn/appconns/dss-eventchecker-appconn/src/main/scala/com/webank/wedatasphere/dss/appconn/eventchecker/execution/EventCheckerRefExecutionOperation.scala @@ -0,0 +1,178 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.eventchecker.execution + + + + +import java.util.{Properties, UUID} + +import com.webank.wedatasphere.dss.appconn.eventchecker.EventCheckerCompletedExecutionResponseRef +import com.webank.wedatasphere.dss.appconn.eventchecker.entity.EventChecker +import com.webank.wedatasphere.dss.standard.app.development.listener.{ExecutionLogListener, ExecutionResultListener} +import com.webank.wedatasphere.dss.standard.app.development.listener.common.{AsyncExecutionRequestRef, AsyncExecutionResponseRef, CompletedExecutionResponseRef, RefExecutionAction, RefExecutionState} +import com.webank.wedatasphere.dss.standard.app.development.listener.core.{Killable, LongTermRefExecutionOperation, Procedure} +import com.webank.wedatasphere.dss.standard.app.development.ref.ExecutionRequestRef +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService +import com.webank.wedatasphere.linkis.common.log.LogUtils +import com.webank.wedatasphere.linkis.common.utils.Utils +import com.webank.wedatasphere.linkis.storage.LineRecord +import org.apache.commons.io.IOUtils +import org.slf4j.LoggerFactory; + + +class EventCheckerRefExecutionOperation extends LongTermRefExecutionOperation with Killable with Procedure{ + + + + private var service:DevelopmentService = _ + private val logger = LoggerFactory.getLogger(classOf[EventCheckerRefExecutionOperation]) + + + + + override def progress(action: RefExecutionAction): Float = { + //temp set + 0.5f + } + + override def log(action: RefExecutionAction): String = { + action match { + case action: EventCheckerExecutionAction => { + if (!action.state.isCompleted) { + LogUtils.generateInfo("EventChecker is sending or waiting for message") + } else { + LogUtils.generateInfo("EventChecker successfully received or send message") + } + } + case _ => LogUtils.generateERROR("Error NodeExecutionAction for log") + } + } + + override def kill(action: RefExecutionAction): Boolean = action 
match { + case longTermAction: EventCheckerExecutionAction => + longTermAction.setKilledFlag(true) + longTermAction.setState(RefExecutionState.Killed) + true + } + + protected def putErrorMsg(errorMsg: String, t: Throwable, action: EventCheckerExecutionAction): EventCheckerExecutionAction = t match { + + case t: Exception => + val response = action.response + response.setErrorMsg(errorMsg) + response.setException(t) + response.setIsSucceed(false) + action + } + + override def submit(requestRef: ExecutionRequestRef): RefExecutionAction = { + val asyncExecutionRequestRef = requestRef.asInstanceOf[AsyncExecutionRequestRef] + val nodeAction = new EventCheckerExecutionAction() + nodeAction.setId(UUID.randomUUID().toString()) + import scala.collection.JavaConversions.mapAsScalaMap + val InstanceConfig = this.service.getAppInstance.getConfig + val scalaParams: scala.collection.mutable.Map[String, Object] =asyncExecutionRequestRef.getExecutionRequestRefContext().getRuntimeMap() + val properties = new Properties() + InstanceConfig.foreach { record => + logger.info("request params key : " + record._1 + ",value : " + record._2) + if(null == record._2){ + properties.put(record._1, "")} + else { + properties.put(record._1, record._2.toString) + } + } + scalaParams.foreach { case (key, value) => + if (key != null && value != null) properties.put(key, value.toString) + } + Utils.tryCatch({ + val ec = new EventChecker(properties, nodeAction) + ec.run() + nodeAction.setEc(ec) + })(t => { + logger.error("EventChecker run failed for " + t.getMessage, t) + putErrorMsg("EventChecker run failed!" 
+ t.getMessage, t, nodeAction) + }) + + nodeAction + } + + override def state(action: RefExecutionAction): RefExecutionState = { + action match { + case action: EventCheckerExecutionAction => { + action.getExecutionRequestRefContext.appendLog("EventCheck is running!") + if (action.state.isCompleted) return action.state + if (action.eventType.equals("RECEIVE")) { + Utils.tryCatch(action.ec.receiveMsg())(t => { + action.setState(RefExecutionState.Failed) + logger.error("EventChecker run failed for " + t.getMessage, t) + putErrorMsg("EventChecker run failed!" + t.getMessage, t, action) + false + }) + } + action.state + } + case _ => RefExecutionState.Failed + } + } + + override def result(action: RefExecutionAction): CompletedExecutionResponseRef = { + val response = new EventCheckerCompletedExecutionResponseRef(200) + action match { + case action: EventCheckerExecutionAction => { + if (action.state.equals(RefExecutionState.Success)) { + val resultSetWriter =action.getExecutionRequestRefContext.createTextResultSetWriter() + var resultStr = "EventChecker runs successfully!" 
+ if (action.saveKeyAndValue != null) { + resultStr = action.saveKeyAndValue + logger.info("EventChecker save receive value: " + resultStr) + } + Utils.tryFinally { + resultSetWriter.addMetaData(null) + resultSetWriter.addRecord(new LineRecord(resultStr)) + }(IOUtils.closeQuietly(resultSetWriter)) + response.setIsSucceed(true) + } else { + response.setException(action.response.getException) + response.setIsSucceed(false) + } + response + } + case _ => + response.setIsSucceed(false); + response + } + + } + + override def createAsyncResponseRef(requestRef: ExecutionRequestRef, action: RefExecutionAction): AsyncExecutionResponseRef = { + action match { + case action: EventCheckerExecutionAction => { + val response = super.createAsyncResponseRef(requestRef,action) + response.setAction(action) + response.setMaxLoopTime(action.ec.maxWaitTime) + response.setAskStatePeriod(action.ec.queryFrequency) + response + } + } + } + + override def setDevelopmentService(service: DevelopmentService): Unit = { + this.service = service + } + +} diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/pom.xml b/dss-appconn/appconns/dss-orchestrator-framework-appconn/pom.xml new file mode 100644 index 000000000..045016f3e --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/pom.xml @@ -0,0 +1,142 @@ + + + + + + dss + com.webank.wedatasphere.dss + 1.0.0 + + 4.0.0 + + dss-orchestrator-framework-appconn + + + + + + com.webank.wedatasphere.dss + dss-orchestrator-core + ${dss.version} + + + com.webank.wedatasphere.linkis + linkis-module + + + com.webank.wedatasphere.linkis + linkis-rpc + + + com.webank.wedatasphere.dss + dss-contextservice + + + linkis-common + com.webank.wedatasphere.linkis + + + json4s-jackson_2.11 + org.json4s + + + + + + com.webank.wedatasphere.linkis + linkis-rpc + ${linkis.version} + provided + + + com.webank.wedatasphere.dss + dss-orchestrator-common + ${dss.version} + compile + + + com.webank.wedatasphere.dss + dss-workflow-common + 
${dss.version} + compile + + + + com.webank.wedatasphere.dss + dss-common + ${dss.version} + provided + + + com.webank.wedatasphere.dss + dss-sender-service + ${dss.version} + provided + + + + + + + + org.apache.maven.plugins + maven-deploy-plugin + + + net.alchim31.maven + scala-maven-plugin + + + org.apache.maven.plugins + maven-jar-plugin + + + org.apache.maven.plugins + maven-assembly-plugin + 2.3 + false + + + make-assembly + package + + single + + + + src/main/assembly/distribution.xml + + + + + + false + out + false + false + + src/main/assembly/distribution.xml + + + + + + + + \ No newline at end of file diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/assembly/distribution.xml b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/assembly/distribution.xml new file mode 100644 index 000000000..dd3313ba8 --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/assembly/distribution.xml @@ -0,0 +1,66 @@ + + + + dss-orchestrator-framework-appconn + + dir + + true + orchestrator-framework + + + + + + lib + true + true + false + true + true + + + + + + ${basedir}/src/main/resources + + appconn.properties + + 0777 + / + unix + + + + ${basedir}/src/main/resources + + log4j.properties + log4j2.xml + + 0777 + conf + unix + + + + + + diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/DefaultOrchestratorFrameworkAppConn.java b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/DefaultOrchestratorFrameworkAppConn.java new file mode 100644 index 000000000..10c123354 --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/DefaultOrchestratorFrameworkAppConn.java @@ -0,0 +1,66 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); 
+ * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator; + +import com.webank.wedatasphere.dss.appconn.core.ext.OnlyDevelopmentAppConn; +import com.webank.wedatasphere.dss.appconn.core.impl.AbstractAppConn; +import com.webank.wedatasphere.dss.appconn.orchestrator.standard.OrchestratorFrameworkStandard; +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorFrameworkAppConn; +import com.webank.wedatasphere.dss.standard.app.development.standard.DevelopmentIntegrationStandard; +import com.webank.wedatasphere.dss.standard.common.core.AppStandard; +import com.webank.wedatasphere.dss.standard.common.desc.AppDesc; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import java.util.ArrayList; +import java.util.List; + +public class DefaultOrchestratorFrameworkAppConn extends AbstractAppConn implements OrchestratorFrameworkAppConn, OnlyDevelopmentAppConn { + + private static final Logger LOGGER = LoggerFactory.getLogger(DefaultOrchestratorFrameworkAppConn.class); + + private OrchestratorFrameworkStandard orchestratorFrameworkStandard; + private AppDesc appDesc; + + private final List standards = new ArrayList<>(); + + + @Override + public void setAppDesc(AppDesc appDesc) { + this.appDesc = appDesc; + } + + @Override + public List getAppStandards() { + return this.standards; + } + + @Override + protected void initialize() { + orchestratorFrameworkStandard = OrchestratorFrameworkStandard.getInstance(); + } + + @Override + public AppDesc getAppDesc() { 
+ return this.appDesc; + } + + @Override + public DevelopmentIntegrationStandard getOrCreateDevelopmentStandard() { + return OrchestratorFrameworkStandard.getInstance(); + } +} diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/conf/OrchestratorConf.java b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/conf/OrchestratorConf.java new file mode 100644 index 000000000..a5d2b6098 --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/conf/OrchestratorConf.java @@ -0,0 +1,21 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator.conf; + +public class OrchestratorConf { + +} diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/operation/OrchestratorFrameworkCreationOperation.java b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/operation/OrchestratorFrameworkCreationOperation.java new file mode 100644 index 000000000..6eb10fee6 --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/operation/OrchestratorFrameworkCreationOperation.java @@ -0,0 +1,73 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator.operation; + +import com.webank.wedatasphere.dss.appconn.orchestrator.ref.DefaultOrchestratorCreateResponseRef; +import com.webank.wedatasphere.dss.common.protocol.ResponseCreateOrchestrator; +import com.webank.wedatasphere.dss.common.utils.DSSExceptionUtils; +import com.webank.wedatasphere.dss.orchestrator.common.protocol.RequestCreateOrchestrator; +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorCreateRequestRef; +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorCreateResponseRef; +import com.webank.wedatasphere.dss.sender.service.DSSSenderServiceFactory; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefCreationOperation; +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import com.webank.wedatasphere.linkis.rpc.Sender; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class OrchestratorFrameworkCreationOperation implements + RefCreationOperation { + private static final Logger LOGGER = LoggerFactory.getLogger(OrchestratorFrameworkCreationOperation.class); + + private final Sender sender = DSSSenderServiceFactory.getOrCreateServiceInstance().getOrcSender(); + private DevelopmentService developmentService; + + @Override + public void setDevelopmentService(DevelopmentService service) { + this.developmentService = service; + } + + @Override + public OrchestratorCreateResponseRef createRef(OrchestratorCreateRequestRef requestRef) throws ExternalOperationFailedException { + if (null == requestRef) { + LOGGER.error("requestRef is null can not create Ref"); + return null; + } + RequestCreateOrchestrator createRequest = new RequestCreateOrchestrator(requestRef.getUserName(), + requestRef.getWorkspaceName(), requestRef.getProjectName(), + requestRef.getProjectId(), 
requestRef.getDSSOrchestratorInfo().getDesc(), + requestRef.getDSSOrchestratorInfo(), requestRef.getDSSLabels()); + ResponseCreateOrchestrator createResponse = null; + try { + createResponse = (ResponseCreateOrchestrator) sender.ask(createRequest); + } catch (Exception e) { + DSSExceptionUtils.dealErrorException(60015, "create orchestrator ref failed", e, + ExternalOperationFailedException.class); + } + if (createResponse == null) { + LOGGER.error("createResponse is null, can not get correct response"); + return null; + } + LOGGER.info("End to ask to create orchestrator, orcId is {} orcVersionId is {}", + createResponse.orchestratorId(), createResponse.orchestratorVersionId()); + DefaultOrchestratorCreateResponseRef createResponseRef = new DefaultOrchestratorCreateResponseRef(); + createResponseRef.setOrcId(createResponse.orchestratorId()); + createResponseRef.setOrchestratorVersionId(createResponse.orchestratorVersionId()); + return createResponseRef; + } +} diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/operation/OrchestratorFrameworkDeleteOperation.java b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/operation/OrchestratorFrameworkDeleteOperation.java new file mode 100644 index 000000000..30978e5a8 --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/operation/OrchestratorFrameworkDeleteOperation.java @@ -0,0 +1,60 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator.operation; + +import com.webank.wedatasphere.dss.common.protocol.JobStatus; +import com.webank.wedatasphere.dss.common.utils.DSSExceptionUtils; +import com.webank.wedatasphere.dss.orchestrator.common.protocol.RequestDeleteOrchestrator; +import com.webank.wedatasphere.dss.orchestrator.common.protocol.ResponseOperateOrchestrator; +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorDeleteRequestRef; +import com.webank.wedatasphere.dss.sender.service.DSSSenderServiceFactory; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefDeletionOperation; +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import com.webank.wedatasphere.linkis.rpc.Sender; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class OrchestratorFrameworkDeleteOperation implements + RefDeletionOperation { + private static final Logger LOGGER = LoggerFactory.getLogger(OrchestratorFrameworkDeleteOperation.class); + + @Override + public void setDevelopmentService(DevelopmentService service) { + } + + @Override + public void deleteRef(OrchestratorDeleteRequestRef requestRef) throws ExternalOperationFailedException { + LOGGER.info("Begin to ask to delete orchestrator, requestRef is {}", requestRef); + RequestDeleteOrchestrator deleteRequest = new RequestDeleteOrchestrator(requestRef.getUserName(), + 
requestRef.getWorkspaceName(), requestRef.getProjectName(), + requestRef.getOrcId(), requestRef.getDSSLabels()); + ResponseOperateOrchestrator deleteResponse = null; + try { + Sender sender = DSSSenderServiceFactory.getOrCreateServiceInstance().getOrcSender(requestRef.getDSSLabels()); + deleteResponse = (ResponseOperateOrchestrator) sender.ask(deleteRequest); + LOGGER.info("End to ask to delete orchestrator , responseRef is {}", deleteResponse); + if (deleteResponse == null || !deleteResponse.getJobStatus().equals(JobStatus.Success)){ + LOGGER.error("delete response is null or delete response status is not success"); + DSSExceptionUtils.dealErrorException(60075, "failed to delete ref", ExternalOperationFailedException.class); + } + } catch (Exception e) { + DSSExceptionUtils.dealErrorException(60015, "delete orchestrator ref failed", e, ExternalOperationFailedException.class); + } + } + +} \ No newline at end of file diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/operation/OrchestratorFrameworkExportOperation.java b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/operation/OrchestratorFrameworkExportOperation.java new file mode 100644 index 000000000..d0c0fc29c --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/operation/OrchestratorFrameworkExportOperation.java @@ -0,0 +1,75 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator.operation; + +import com.webank.wedatasphere.dss.appconn.orchestrator.ref.DefaultOrchestratorExportResponseRef; +import com.webank.wedatasphere.dss.common.protocol.ResponseExportOrchestrator; +import com.webank.wedatasphere.dss.common.utils.DSSCommonUtils; +import com.webank.wedatasphere.dss.common.utils.DSSExceptionUtils; +import com.webank.wedatasphere.dss.orchestrator.common.protocol.RequestExportOrchestrator; +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorExportRequestRef; +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorExportResponseRef; +import com.webank.wedatasphere.dss.sender.service.DSSSenderServiceFactory; +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefExportOperation; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import com.webank.wedatasphere.linkis.rpc.Sender; +import com.webank.wedatasphere.linkis.server.BDPJettyServerHelper; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class OrchestratorFrameworkExportOperation implements RefExportOperation { + private static final Logger LOGGER = LoggerFactory.getLogger(OrchestratorFrameworkExportOperation.class); + + private final Sender sender = DSSSenderServiceFactory.getOrCreateServiceInstance().getOrcSender(); + private DevelopmentService service; + + @Override + public 
OrchestratorExportResponseRef exportRef(OrchestratorExportRequestRef requestRef) throws ExternalOperationFailedException { + if (null == requestRef){ + LOGGER.error("request ref for exporting is null, it is a fatal error"); + return null; + } + LOGGER.info("Begin to ask to export orchestrator, requestRef is {}", DSSCommonUtils.COMMON_GSON.toJson(requestRef)); + RequestExportOrchestrator exportRequest = new RequestExportOrchestrator(requestRef.getUserName(), + requestRef.getWorkspaceName(), requestRef.getOrcId(), -1L, + requestRef.getProjectName(), requestRef.getDSSLabels(), requestRef.getAddOrcVersionFlag(), + BDPJettyServerHelper.gson().toJson(requestRef.getWorkspace())); + ResponseExportOrchestrator exportResponse = null; + try{ + exportResponse = (ResponseExportOrchestrator) sender.ask(exportRequest); + }catch(final Throwable e){ + DSSExceptionUtils.dealErrorException(60015, "export orchestrator ref failed", e, + ExternalOperationFailedException.class); + } + LOGGER.info("End to ask to export orchestrator, responseRef is {}", DSSCommonUtils.COMMON_GSON.toJson(exportResponse)); + if(exportResponse == null){ + LOGGER.error("exportResponse is null, it means export is failed"); + DSSExceptionUtils.dealErrorException(63323, "exportResponse is null, it means export is failed", ExternalOperationFailedException.class); + } + DefaultOrchestratorExportResponseRef exportResponseRef = new DefaultOrchestratorExportResponseRef(); + exportResponseRef.setBmlVersion(exportResponse.version()); + exportResponseRef.setResourceId(exportResponse.resourceId()); + exportResponseRef.setOrchestratorVersionId(exportResponse.orcVersionId()); + return exportResponseRef; + } + + @Override + public void setDevelopmentService(DevelopmentService service) { + this.service = service; + } +} diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/operation/OrchestratorFrameworkImportOperation.java 
b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/operation/OrchestratorFrameworkImportOperation.java new file mode 100644 index 000000000..c18b0732a --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/operation/OrchestratorFrameworkImportOperation.java @@ -0,0 +1,71 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator.operation; + +import com.webank.wedatasphere.dss.common.protocol.ResponseImportOrchestrator; +import com.webank.wedatasphere.dss.common.utils.DSSCommonUtils; +import com.webank.wedatasphere.dss.common.utils.DSSExceptionUtils; +import com.webank.wedatasphere.dss.orchestrator.common.protocol.RequestImportOrchestrator; +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorImportRequestRef; +import com.webank.wedatasphere.dss.sender.service.DSSSenderServiceFactory; +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefImportOperation; +import com.webank.wedatasphere.dss.standard.app.development.ref.CommonResponseRef; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import com.webank.wedatasphere.linkis.rpc.Sender; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class OrchestratorFrameworkImportOperation implements + RefImportOperation { + private static final Logger LOGGER = LoggerFactory.getLogger(OrchestratorFrameworkImportOperation.class); + private DevelopmentService service; + + @Override + public CommonResponseRef importRef(OrchestratorImportRequestRef requestRef) throws ExternalOperationFailedException { + if (null == requestRef) { + return null; + } + LOGGER.info("Begin to ask to import orchestrator, requestRef is {}", DSSCommonUtils.COMMON_GSON.toJson(requestRef)); + RequestImportOrchestrator importRequest = new RequestImportOrchestrator(requestRef.getUserName(), + requestRef.getWorkspaceName(), requestRef.getProjectName(), + requestRef.getProjectId(), requestRef.getResourceId(), + requestRef.getBmlVersion(), requestRef.getOrcName(), requestRef.getDSSLabels(), + DSSCommonUtils.COMMON_GSON.toJson(requestRef.getWorkspace())); + ResponseImportOrchestrator importResponse = null; + try { + 
Sender sender = DSSSenderServiceFactory.getOrCreateServiceInstance().getOrcSender(requestRef.getDSSLabels()); + importResponse = (ResponseImportOrchestrator) sender.ask(importRequest); + } catch (final Throwable t) { + DSSExceptionUtils.dealErrorException(60015, "import orchestrator ref failed", t, + ExternalOperationFailedException.class); + } + LOGGER.info("End to ask to import orchestrator, responseRef is {}", DSSCommonUtils.COMMON_GSON.toJson(importResponse)); + CommonResponseRef importResponseRef = new CommonResponseRef(); + if (null == importResponse){ + LOGGER.error("importResponse is null, which means importing the ref failed"); + DSSExceptionUtils.dealErrorException(60015, "import ref response is null", ExternalOperationFailedException.class); + } + importResponseRef.setOrcId(importResponse.orcId()); + return importResponseRef; + } + + @Override + public void setDevelopmentService(DevelopmentService service) { + this.service = service; + } +} diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/operation/OrchestratorFrameworkQueryOperation.java b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/operation/OrchestratorFrameworkQueryOperation.java new file mode 100644 index 000000000..4dc75bb62 --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/operation/OrchestratorFrameworkQueryOperation.java @@ -0,0 +1,68 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator.operation; + +import com.webank.wedatasphere.dss.appconn.orchestrator.ref.DefaultOrchestratorQueryResponseRef; +import com.webank.wedatasphere.dss.common.utils.DSSExceptionUtils; +import com.webank.wedatasphere.dss.orchestrator.common.protocol.RequestQueryOrchestrator; +import com.webank.wedatasphere.dss.orchestrator.common.protocol.ResponseQueryOrchestrator; +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorQueryRequestRef; +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorQueryResponseRef; +import com.webank.wedatasphere.dss.sender.service.DSSSenderServiceFactory; +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefQueryOperation; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import com.webank.wedatasphere.linkis.rpc.Sender; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class OrchestratorFrameworkQueryOperation implements RefQueryOperation { + private static final Logger LOGGER = LoggerFactory.getLogger(OrchestratorFrameworkQueryOperation.class); + + private final Sender sender = DSSSenderServiceFactory.getOrCreateServiceInstance().getOrcSender(); + private DevelopmentService developmentService; + + @Override + public OrchestratorQueryResponseRef query(OrchestratorQueryRequestRef requestRef) throws ExternalOperationFailedException { + if (null 
== requestRef) { + LOGGER.error("request of query is null"); + return null; + } + LOGGER.info("Begin to ask to query orchestrator, requestRef is {}", requestRef); + RequestQueryOrchestrator queryRequest = new RequestQueryOrchestrator(requestRef.getOrchestratorIdList()); + ResponseQueryOrchestrator queryResponse = null; + try { + queryResponse = (ResponseQueryOrchestrator) sender.ask(queryRequest); + } catch (Exception e) { + DSSExceptionUtils.dealErrorException(60015, "query orchestrator ref failed", e, + ExternalOperationFailedException.class); + } + if (queryResponse == null) { + LOGGER.error("query response is null; this is a fatal error"); + return null; + } + LOGGER.info("End to ask to query orchestrator, responseRef is {}", queryResponse); + OrchestratorQueryResponseRef queryResponseRef = new DefaultOrchestratorQueryResponseRef(); + queryResponseRef.setOrchestratorVoList(queryResponse.getOrchestratorVoes()); + return queryResponseRef; + } + + @Override + public void setDevelopmentService(DevelopmentService service) { + this.developmentService = service; + } +} diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/operation/OrchestratorFrameworkUpdateOperation.java b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/operation/OrchestratorFrameworkUpdateOperation.java new file mode 100644 index 000000000..a69671bd1 --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/operation/OrchestratorFrameworkUpdateOperation.java @@ -0,0 +1,67 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator.operation; + +import com.webank.wedatasphere.dss.common.protocol.JobStatus; +import com.webank.wedatasphere.dss.common.utils.DSSExceptionUtils; +import com.webank.wedatasphere.dss.orchestrator.common.protocol.RequestUpdateOrchestrator; +import com.webank.wedatasphere.dss.orchestrator.common.protocol.ResponseOperateOrchestrator; +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorUpdateRef; +import com.webank.wedatasphere.dss.sender.service.DSSSenderServiceFactory; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefUpdateOperation; +import com.webank.wedatasphere.dss.standard.app.development.ref.CommonResponseRef; +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import com.webank.wedatasphere.linkis.rpc.Sender; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class OrchestratorFrameworkUpdateOperation implements + RefUpdateOperation { + private static final Logger LOGGER = LoggerFactory.getLogger(OrchestratorFrameworkUpdateOperation.class); + + @Override + public CommonResponseRef updateRef(OrchestratorUpdateRef requestRef) throws ExternalOperationFailedException { + if (null == requestRef){ + LOGGER.error("request ref is null, can not deal with null ref"); + return null; + } + LOGGER.info("Begin to ask to update orchestrator, requestRef is {}", requestRef); + 
RequestUpdateOrchestrator updateRequest = new RequestUpdateOrchestrator(requestRef.getUserName(), + requestRef.getWorkspaceName(), requestRef.getOrchestratorInfo(), requestRef.getDSSLabels()); + ResponseOperateOrchestrator updateResponse = null; + Sender sender = DSSSenderServiceFactory.getOrCreateServiceInstance().getOrcSender(requestRef.getDSSLabels()); + try { + updateResponse = (ResponseOperateOrchestrator) sender.ask(updateRequest); + } catch (final Exception e) { + DSSExceptionUtils.dealErrorException(60015, "update orchestrator ref failed", e, + ExternalOperationFailedException.class); + } + LOGGER.info("End to ask to update orchestrator, responseRef is {}", updateResponse); + if (updateResponse == null) { + LOGGER.error("updateResponse from orchestrator is null"); + return null; + } + CommonResponseRef updateResponseRef = new CommonResponseRef(); + updateResponseRef.setResult(JobStatus.Success.equals(updateResponse.getJobStatus())); + return updateResponseRef; + } + + @Override + public void setDevelopmentService(DevelopmentService service) { + } +} diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorCreateResponseRef.java b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorCreateResponseRef.java new file mode 100644 index 000000000..2ee515573 --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorCreateResponseRef.java @@ -0,0 +1,40 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator.ref; + +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorCreateResponseRef; +import com.webank.wedatasphere.dss.standard.app.development.ref.CommonResponseRef; + +public class DefaultOrchestratorCreateResponseRef extends CommonResponseRef implements OrchestratorCreateResponseRef { + + + private Long versionId; + + + @Override + public Long getOrchestratorVersionId() { + return versionId; + } + + @Override + public void setOrchestratorVersionId(Long versionId) { + this.versionId = versionId; + } + + + +} diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorDeleteRequestRef.java b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorDeleteRequestRef.java new file mode 100644 index 000000000..5d8f1dfe1 --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorDeleteRequestRef.java @@ -0,0 +1,36 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator.ref; + +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorDeleteRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.ref.impl.CommonRequestRefImpl; + +public class DefaultOrchestratorDeleteRequestRef extends CommonRequestRefImpl implements OrchestratorDeleteRequestRef { + + private Long appId; + + @Override + public void setAppId(Long appId) { + this.appId = appId; + } + + @Override + public Long getAppId() { + return this.appId; + } + +} diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorExportRequestRef.java b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorExportRequestRef.java new file mode 100644 index 000000000..937c73480 --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorExportRequestRef.java @@ -0,0 +1,65 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator.ref; + + +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorExportRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.ref.impl.CommonRequestRefImpl; + +public class DefaultOrchestratorExportRequestRef extends CommonRequestRefImpl implements OrchestratorExportRequestRef { + + + private Long appId; + + private Long orchestratorVersionId; + + private boolean addOrcVersion; + + + @Override + public void setAppId(Long appId) { + this.appId = appId; + } + + @Override + public Long getAppId() { + return appId; + } + + @Override + public void setOrchestratorVersionId(Long orchestratorVersionId) { + this.orchestratorVersionId = orchestratorVersionId; + } + + + @Override + public Long getOrchestratorVersionId() { + return orchestratorVersionId; + } + + + @Override + public boolean getAddOrcVersionFlag() { + return addOrcVersion; + } + + @Override + public void setAddOrcVersionFlag(boolean addOrcVersion) { + this.addOrcVersion = addOrcVersion; + } + +} diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorExportResponseRef.java b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorExportResponseRef.java new file mode 100644 index 000000000..3464e0af6 --- /dev/null +++ 
b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorExportResponseRef.java @@ -0,0 +1,59 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator.ref; + +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorExportResponseRef; +import com.webank.wedatasphere.dss.standard.app.development.ref.CommonResponseRef; + + +public class DefaultOrchestratorExportResponseRef extends CommonResponseRef implements OrchestratorExportResponseRef { + + private String resourceId; + + private String bmlVersion; + + private Long orchestratorVersionId; + @Override + public String getResourceId() { + return resourceId; + } + + @Override + public void setResourceId(String resourceId) { + this.resourceId = resourceId; + } + + @Override + public String getBmlVersion() { + return bmlVersion; + } + + @Override + public void setBmlVersion(String bmlVersion) { + this.bmlVersion = bmlVersion; + } + + @Override + public Long getOrchestratorVersionId() { + return orchestratorVersionId; + } + + @Override + public void setOrchestratorVersionId(Long orchestratorVersionId) { + this.orchestratorVersionId = orchestratorVersionId; + } +} diff --git 
a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorImportRequestRef.java b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorImportRequestRef.java new file mode 100644 index 000000000..ff65b14c1 --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorImportRequestRef.java @@ -0,0 +1,70 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator.ref; + +import com.webank.wedatasphere.dss.common.entity.IOEnv; +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorImportRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.ref.impl.CommonRequestRefImpl; + + +public class DefaultOrchestratorImportRequestRef extends CommonRequestRefImpl implements OrchestratorImportRequestRef { + private String resourceId; + private String bmlVersion; + private IOEnv sourceEnv; + private String orcVersion; + + @Override + public void setResourceId(String resourceId) { + this.resourceId = resourceId; + } + + @Override + public String getResourceId() { + return resourceId; + } + + @Override + public void setBmlVersion(String bmlVersion) { + this.bmlVersion = bmlVersion; + } + + @Override + public String getBmlVersion() { + return bmlVersion; + } + @Override + public void setSourceEnv(IOEnv sourceEnv) { + this.sourceEnv = sourceEnv; + } + + @Override + public IOEnv getSourceEnv() { + return sourceEnv; + } + + @Override + public void setOrcVersion(String orcVersion) { + this.orcVersion = orcVersion; + } + + @Override + public String getOrcVersion() { + return orcVersion; + } + + +} diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorQueryRequestRef.java b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorQueryRequestRef.java new file mode 100644 index 000000000..164f2f8d6 --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorQueryRequestRef.java @@ -0,0 +1,39 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator.ref; + +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorQueryRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.ref.impl.CommonRequestRefImpl; + +import java.util.List; + + +public class DefaultOrchestratorQueryRequestRef extends CommonRequestRefImpl implements OrchestratorQueryRequestRef { + + private List orcIdList; + + @Override + public List getOrchestratorIdList() { + return orcIdList; + } + + @Override + public void setOrchestratorIdList(List orchestratorIdList) { + this.orcIdList = orchestratorIdList; + } + +} diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorQueryResponseRef.java b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorQueryResponseRef.java new file mode 100644 index 000000000..bd7cead0b --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorQueryResponseRef.java @@ -0,0 +1,38 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator.ref; + +import com.webank.wedatasphere.dss.orchestrator.common.entity.OrchestratorVo; +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorQueryResponseRef; +import com.webank.wedatasphere.dss.standard.app.development.ref.CommonResponseRef; + +import java.util.List; + +public class DefaultOrchestratorQueryResponseRef extends CommonResponseRef implements OrchestratorQueryResponseRef { + private List orchestratorVos; + + @Override + public List getOrchestratorVos() { + return orchestratorVos; + } + + @Override + public void setOrchestratorVoList(List orchestratorVoList) { + this.orchestratorVos = orchestratorVoList; + } + +} diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorUpdateRequestRef.java b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorUpdateRequestRef.java new file mode 100644 index 000000000..b2f903fd8 --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/ref/DefaultOrchestratorUpdateRequestRef.java @@ -0,0 +1,58 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator.ref; + +import com.webank.wedatasphere.dss.orchestrator.common.entity.DSSOrchestratorInfo; +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorUpdateRef; +import com.webank.wedatasphere.dss.standard.app.development.ref.impl.CommonRequestRefImpl; + +public class DefaultOrchestratorUpdateRequestRef extends CommonRequestRefImpl implements OrchestratorUpdateRef { + + private String description; + private String uses; + + private DSSOrchestratorInfo dssOrchestratorInfo; + + @Override + public void setDescription(String description) { + this.description = description; + } + + @Override + public String getDescription() { + return description; + } + + @Override + public void setUses(String uses) { + this.uses = uses; + } + + @Override + public String getUses() { + return uses; + } + + @Override + public DSSOrchestratorInfo getOrchestratorInfo() { + return dssOrchestratorInfo; + } + + @Override + public void setOrchestratorInfo(DSSOrchestratorInfo dssOrchestratorInfo) { + this.dssOrchestratorInfo = dssOrchestratorInfo; } +} diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/service/OrchestratorCRUDService.java b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/service/OrchestratorCRUDService.java new file mode 100644 index 000000000..920a2706b --- /dev/null +++
b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/service/OrchestratorCRUDService.java @@ -0,0 +1,50 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator.service; + +import com.webank.wedatasphere.dss.appconn.orchestrator.operation.OrchestratorFrameworkCreationOperation; +import com.webank.wedatasphere.dss.appconn.orchestrator.operation.OrchestratorFrameworkDeleteOperation; +import com.webank.wedatasphere.dss.appconn.orchestrator.operation.OrchestratorFrameworkUpdateOperation; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefCopyOperation; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefCreationOperation; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefDeletionOperation; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefUpdateOperation; +import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefCRUDService; + +public class OrchestratorCRUDService extends AbstractRefCRUDService { + + @Override + protected RefCreationOperation createRefCreationOperation() { + return new OrchestratorFrameworkCreationOperation(); + } + + @Override + protected RefCopyOperation createRefCopyOperation() { + return null; + } + + @Override + protected RefUpdateOperation createRefUpdateOperation() { + 
return new OrchestratorFrameworkUpdateOperation(); + } + + @Override + protected RefDeletionOperation createRefDeletionOperation() { + return new OrchestratorFrameworkDeleteOperation(); + } + +} diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/service/OrchestratorExportProcessService.java b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/service/OrchestratorExportProcessService.java new file mode 100644 index 000000000..2b0fb967e --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/service/OrchestratorExportProcessService.java @@ -0,0 +1,32 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator.service; + +import com.webank.wedatasphere.dss.appconn.orchestrator.operation.OrchestratorFrameworkExportOperation; +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorExportRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefExportOperation; +import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefExportService; + +public class OrchestratorExportProcessService extends AbstractRefExportService { + + @Override + @SuppressWarnings("unchecked") + public RefExportOperation createRefExportOperation() { + return new OrchestratorFrameworkExportOperation(); + } + +} diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/service/OrchestratorImportProcessService.java b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/service/OrchestratorImportProcessService.java new file mode 100644 index 000000000..88b364c2e --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/service/OrchestratorImportProcessService.java @@ -0,0 +1,30 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator.service; + +import com.webank.wedatasphere.dss.appconn.orchestrator.operation.OrchestratorFrameworkImportOperation; +import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefImportService; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefImportOperation; + +public class OrchestratorImportProcessService extends AbstractRefImportService { + + @Override + protected RefImportOperation createRefImportOperation() { + return new OrchestratorFrameworkImportOperation(); + } + +} diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/service/OrchestratorQueryService.java b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/service/OrchestratorQueryService.java new file mode 100644 index 000000000..11649efd3 --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/service/OrchestratorQueryService.java @@ -0,0 +1,32 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator.service; + +import com.webank.wedatasphere.dss.appconn.orchestrator.operation.OrchestratorFrameworkQueryOperation; +import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefQueryService; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefQueryOperation; + +public class OrchestratorQueryService extends AbstractRefQueryService { + + @Override + public RefQueryOperation createRefQueryOperation() { + OrchestratorFrameworkQueryOperation orchestratorFrameworkQueryOperation = new OrchestratorFrameworkQueryOperation(); + orchestratorFrameworkQueryOperation.setDevelopmentService(this); + return orchestratorFrameworkQueryOperation; + } + +} diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/standard/OrchestratorFrameworkStandard.java b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/standard/OrchestratorFrameworkStandard.java new file mode 100644 index 000000000..dfd6072bc --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/orchestrator/standard/OrchestratorFrameworkStandard.java @@ -0,0 +1,73 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License.
+ * + */ + +package com.webank.wedatasphere.dss.appconn.orchestrator.standard; + +import com.webank.wedatasphere.dss.appconn.orchestrator.service.OrchestratorCRUDService; +import com.webank.wedatasphere.dss.appconn.orchestrator.service.OrchestratorExportProcessService; +import com.webank.wedatasphere.dss.appconn.orchestrator.service.OrchestratorImportProcessService; +import com.webank.wedatasphere.dss.appconn.orchestrator.service.OrchestratorQueryService; +import com.webank.wedatasphere.dss.standard.app.development.service.*; +import com.webank.wedatasphere.dss.standard.app.development.standard.AbstractDevelopmentIntegrationStandard; + +public class OrchestratorFrameworkStandard extends AbstractDevelopmentIntegrationStandard { + + private volatile static OrchestratorFrameworkStandard instance; + + public static OrchestratorFrameworkStandard getInstance(){ + if (instance == null){ + synchronized (OrchestratorFrameworkStandard.class){ + if (instance == null){ + instance = new OrchestratorFrameworkStandard(); + } + } + } + return instance; + } + + @Override + protected RefCRUDService createRefCRUDService() { + return new OrchestratorCRUDService(); + } + + @Override + protected RefExecutionService createRefExecutionService() { + return null; + } + + @Override + protected RefExportService createRefExportService() { + return new OrchestratorExportProcessService(); + } + + @Override + protected RefImportService createRefImportService() { + return new OrchestratorImportProcessService(); + } + + @Override + protected RefQueryService createRefQueryService() { + return new OrchestratorQueryService(); + } + + + @Override + public String getStandardName() { + return "OrchestratorFrameworkStandard"; + } + + +} diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/resources/appconn.properties b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/resources/appconn.properties new file mode 100644 index 000000000..19365e2b5 --- /dev/null +++ 
b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/resources/appconn.properties @@ -0,0 +1,20 @@ +# +# Copyright 2019 WeBank +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# +# + + + + + diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/resources/log4j.properties b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/resources/log4j.properties new file mode 100644 index 000000000..ee8619595 --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/resources/log4j.properties @@ -0,0 +1,36 @@ +# +# Copyright 2019 WeBank +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# +# + +### set log levels ### + +log4j.rootCategory=INFO,console + +log4j.appender.console=org.apache.log4j.ConsoleAppender +log4j.appender.console.Threshold=INFO +log4j.appender.console.layout=org.apache.log4j.PatternLayout +#log4j.appender.console.layout.ConversionPattern= %d{ISO8601} %-5p (%t) [%F:%M(%L)] - %m%n +log4j.appender.console.layout.ConversionPattern= %d{ISO8601} %-5p (%t) %p %c{1} - %m%n + + +log4j.appender.com.webank.bdp.ide.core=org.apache.log4j.DailyRollingFileAppender +log4j.appender.com.webank.bdp.ide.core.Threshold=INFO +log4j.additivity.com.webank.bdp.ide.core=false +log4j.appender.com.webank.bdp.ide.core.layout=org.apache.log4j.PatternLayout +log4j.appender.com.webank.bdp.ide.core.Append=true +log4j.appender.com.webank.bdp.ide.core.File=logs/linkis.log +log4j.appender.com.webank.bdp.ide.core.layout.ConversionPattern= %d{ISO8601} %-5p (%t) [%F:%M(%L)] - %m%n + +log4j.logger.org.springframework=INFO diff --git a/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/resources/log4j2.xml b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/resources/log4j2.xml new file mode 100644 index 000000000..8c40a73e8 --- /dev/null +++ b/dss-appconn/appconns/dss-orchestrator-framework-appconn/src/main/resources/log4j2.xml @@ -0,0 +1,38 @@ + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dss-appconn/appconns/dss-schedulis-appconn/pom.xml b/dss-appconn/appconns/dss-schedulis-appconn/pom.xml new file mode 100644 index 000000000..926ad110c --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/pom.xml @@ -0,0 +1,136 @@ + + + + + + dss + com.webank.wedatasphere.dss + 1.0.0 + ../../../pom.xml + + 4.0.0 + + dss-schedulis-appconn + + + + + + com.webank.wedatasphere.dss + dss-scheduler-appconn + ${dss.version} + + + + + com.webank.wedatasphere.dss + dss-origin-sso-integration-standard + ${dss.version} + + + linkis-common + com.webank.wedatasphere.linkis + + + json4s-jackson_2.11 + org.json4s + + + + + + + 
org.apache.httpcomponents + httpclient + 4.5.13 + + + + + com.google.code.gson + gson + ${gson.version} + provided + + + com.webank.wedatasphere.dss + dss-contextservice + ${dss.version} + provided + + + com.webank.wedatasphere.dss + dss-common + ${dss.version} + provided + + + + + + + + + org.apache.maven.plugins + maven-deploy-plugin + + + + net.alchim31.maven + scala-maven-plugin + + + org.apache.maven.plugins + maven-jar-plugin + + + org.apache.maven.plugins + maven-assembly-plugin + 2.3 + false + + + make-assembly + package + + single + + + + src/main/assembly/distribution.xml + + + + + + false + out + false + false + + src/main/assembly/distribution.xml + + + + + + + + + \ No newline at end of file diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/assembly/distribution.xml b/dss-appconn/appconns/dss-schedulis-appconn/src/main/assembly/distribution.xml new file mode 100644 index 000000000..36cd42577 --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/assembly/distribution.xml @@ -0,0 +1,76 @@ + + + + dss-schedulis-appconn + + dir + + true + schedulis + + + + + + lib + true + true + false + true + true + + + + + + ${basedir}/src/main/resources + + appconn.properties + + 0777 + / + unix + + + + ${basedir}/src/main/resources + + log4j.properties + log4j2.xml + + 0777 + conf + unix + + + + ${basedir}/src/main/resources + + init.sql + + 0777 + db + + + + + + + diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/Action/FlowScheduleAction.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/Action/FlowScheduleAction.java new file mode 100644 index 000000000..17611d6be --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/Action/FlowScheduleAction.java @@ -0,0 +1,37 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 
2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.Action; + +import com.webank.wedatasphere.linkis.httpclient.request.POSTAction; + +public class FlowScheduleAction extends POSTAction { + + private String url; + @Override + public String getRequestPayload() { + return ""; + } + + public void setURL(String url) { + this.url = url; + } + + @Override + public String getURL() { + return url; + } +} \ No newline at end of file diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/Action/FlowScheduleGetAction.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/Action/FlowScheduleGetAction.java new file mode 100644 index 000000000..e316579cf --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/Action/FlowScheduleGetAction.java @@ -0,0 +1,47 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+ * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.Action; + +import com.webank.wedatasphere.linkis.httpclient.request.GetAction; +import com.webank.wedatasphere.linkis.httpclient.request.UserAction; + +public class FlowScheduleGetAction extends GetAction implements UserAction { + + String url; + String user; + + @Override + public String getURL() { + return url; + } + + public void setURL(String url) { + this.url = url; + } + + @Override + public void setUser(String user) { + this.user = user; + } + + @Override + public String getUser() { + return user; + } + + +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/Action/FlowScheduleUploadAction.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/Action/FlowScheduleUploadAction.java new file mode 100644 index 000000000..844a7494f --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/Action/FlowScheduleUploadAction.java @@ -0,0 +1,105 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.Action; + +import com.webank.wedatasphere.dss.appconn.schedulis.sso.UserInfo; +import com.webank.wedatasphere.linkis.httpclient.request.BinaryBody; +import com.webank.wedatasphere.linkis.httpclient.request.POSTAction; +import com.webank.wedatasphere.linkis.httpclient.request.UploadAction; +import scala.Option; + + +import java.io.InputStream; +import java.util.ArrayList; +import java.util.HashMap; +import java.util.List; +import java.util.Map; + +public class FlowScheduleUploadAction extends POSTAction implements UploadAction, UserInfo { + private Map<String, InputStream> inputStreams; + private List<BinaryBody> binaryBodies; + private Map<String, String> streamNames = new HashMap<>(); + + private String url; + private String user; + private ArrayList<String> filePaths; + + public FlowScheduleUploadAction(List<BinaryBody> binaryBodies){ + this.filePaths = null; + this.binaryBodies = binaryBodies; + } + + @Override + public String getURL() { + return this.url; + } + + public void setURl(String url){ + this.url = url; + } + + @Override + public Map<String, String> files() { + Map<String, String> map = new HashMap<>(); + + if (null == filePaths || filePaths.size() == 0) { + return map; + } + else { + filePaths.stream().forEach( + filePath -> map.put("file", filePath)); + } + + return map; + } + + @Override + public Map<String, InputStream> inputStreams() { + return inputStreams; + } + + @Override + public Map<String, String> inputStreamNames() { + return streamNames; + } + + @Override + public Option<String> user() { + return null; + } + + + @Override + public String getRequestPayload() { + return null; + } + + @Override + public void setUser(String user) { + this.user = user; + } + + @Override + public String getUser() { + return user; + } + + @Override + public List<BinaryBody> binaryBodies() { + return binaryBodies; + } +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/SchedulisAppConn.java
b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/SchedulisAppConn.java new file mode 100644 index 000000000..33f1a7e24 --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/SchedulisAppConn.java @@ -0,0 +1,33 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis; + +import com.webank.wedatasphere.dss.appconn.scheduler.AbstractSchedulerAppConn; +import com.webank.wedatasphere.dss.appconn.schedulis.standard.SchedulisStructureStandard; +import com.webank.wedatasphere.dss.standard.app.structure.StructureIntegrationStandard; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class SchedulisAppConn extends AbstractSchedulerAppConn { + + public static final String SCHEDULIS_APPCONN_NAME = "Schedulis"; + + @Override + public StructureIntegrationStandard getOrCreateStructureStandard() { + return SchedulisStructureStandard.getInstance(); + } +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/conf/AzkabanConf.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/conf/AzkabanConf.java new file mode 100644 index 000000000..7bd7e58de --- /dev/null +++ 
b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/conf/AzkabanConf.java @@ -0,0 +1,32 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.conf; + + +import com.webank.wedatasphere.linkis.common.conf.CommonVars; + +public class AzkabanConf { + public static final CommonVars<String> DEFAULT_STORE_PATH = CommonVars.apply("wds.dss.appconn.scheduler.project.store.dir", "/appcom/tmp/wds/dss"); + public static final CommonVars<String> AZKABAN_LOGIN_PWD = CommonVars.apply("wds.dss.appconn.scheduler.azkaban.login.passwd", "userpwd"); + public static final CommonVars<String> LINKIS_VERSION = CommonVars.apply("wds.dss.appconn.scheduler.linkis.version", "1.0.0"); + public static final CommonVars<String> JOB_LABEL = CommonVars.apply("wds.dss.appconn.scheduler.job.label", "prod"); + + public static final CommonVars<String> AZKABAN_RSA = + CommonVars.apply("wds.dss.appconn.scheduler.azkaban.rsa", + "iVGljygYsZvrNFCYqOpocm4Kpp7rJudeKomnM9FImWI4yNjdtmOt43Q1Brb7MFgzjRLIdJWJ0Ui760pDZpXPHOz81ctsj553E5cdKu7cg+h5C2AkBEZ6AvEQ+oet7ukwg+7ASSBuQufLkAxHKDqCjq/XxsC/MH11pkuHKaJpSTY="); + +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/constant/AzkabanConstant.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/constant/AzkabanConstant.java new file mode
100644 index 000000000..92b73191d --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/constant/AzkabanConstant.java @@ -0,0 +1,32 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.constant; + +public class AzkabanConstant { + public final static String LINKIS_FLOW_VARIABLE_KEY = "flow.variable."; + public final static String AZKABAN_JOB_SUFFIX = ".job"; + public final static String AZKABAN_PROPERTIES_SUFFIX = ".properties"; + public final static String LINKIS_JOB_RESOURCES_KEY = "resources="; + public final static String ZAKABAN_DEPENDENCIES_KEY = "dependencies"; + public final static String JOB_TYPE = "type"; + public final static String JOB_LABELS = "labels"; + public final static String LINKIS_TYPE = "linkistype"; + public final static String JOB_COMMAND = "command"; + public final static String FLOW_CONTEXT_ID = "wds.linkis.flow.contextID="; + public final static String LINKIS_VERSION = "linkis.version"; + +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/conversion/AzkabanWorkflowToRelSynchronizer.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/conversion/AzkabanWorkflowToRelSynchronizer.java new file mode 100644 index 000000000..90e06d22e --- /dev/null +++ 
b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/conversion/AzkabanWorkflowToRelSynchronizer.java @@ -0,0 +1,110 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.conversion; + +import com.webank.wedatasphere.dss.appconn.schedulis.Action.FlowScheduleUploadAction; +import com.webank.wedatasphere.dss.appconn.schedulis.entity.AzkabanConvertedRel; +import com.webank.wedatasphere.dss.appconn.schedulis.utils.SSORequestWTSS; +import com.webank.wedatasphere.dss.common.exception.DSSRuntimeException; +import com.webank.wedatasphere.dss.common.utils.ZipHelper; +import com.webank.wedatasphere.dss.standard.app.sso.Workspace; +import com.webank.wedatasphere.dss.standard.app.sso.request.SSORequestService; +import com.webank.wedatasphere.dss.standard.common.desc.AppInstance; +import com.webank.wedatasphere.dss.workflow.conversion.entity.ConvertedRel; +import com.webank.wedatasphere.dss.workflow.conversion.operation.WorkflowToRelSynchronizer; +import com.webank.wedatasphere.linkis.common.exception.ErrorException; +import com.webank.wedatasphere.linkis.httpclient.request.BinaryBody; +import com.webank.wedatasphere.linkis.httpclient.response.HttpResult; +import java.io.File; +import java.io.FileInputStream; +import java.io.InputStream; +import java.util.ArrayList; +import java.util.HashMap; +import java.util.List; +import 
java.util.Map; +import org.apache.commons.io.IOUtils; +import org.apache.commons.lang.exception.ExceptionUtils; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class AzkabanWorkflowToRelSynchronizer implements WorkflowToRelSynchronizer { + + public static final Logger LOGGER = LoggerFactory.getLogger(AzkabanWorkflowToRelSynchronizer.class); + + private String projectUrl; + private SSORequestService ssoRequestService; + + @Override + public void setAppInstance(AppInstance appInstance) { + this.projectUrl = appInstance.getBaseUrl().endsWith("/") ? appInstance.getBaseUrl() + "manager" : + appInstance.getBaseUrl() + "/manager"; + } + + @Override + public void syncToRel(ConvertedRel convertedRel) { + String tmpSavePath; + AzkabanConvertedRel azkabanConvertedRel = (AzkabanConvertedRel) convertedRel; + try { + String projectPath = azkabanConvertedRel.getStorePath(); + tmpSavePath = ZipHelper.zip(projectPath); + // upload the zipped project to Azkaban + uploadProject(azkabanConvertedRel.getDSSToRelConversionRequestRef().getWorkspace(), tmpSavePath, + azkabanConvertedRel.getDSSToRelConversionRequestRef().getDSSProject().getName(), azkabanConvertedRel.getDSSToRelConversionRequestRef().getUserName()); + } catch (Exception e) { + throw new DSSRuntimeException(90012, ExceptionUtils.getRootCauseMessage(e), e); + } + } + + @Override + public void setSSORequestService(SSORequestService ssoRequestService) { + this.ssoRequestService = ssoRequestService; + } + + private void uploadProject(Workspace workspace, String tmpSavePath, String projectName, String releaseUser) throws Exception { + + File file = new File(tmpSavePath); + InputStream inputStream = new FileInputStream(file); + try { + BinaryBody binaryBody = BinaryBody.apply("file", inputStream, file.getName(), "application/zip"); + List<BinaryBody> binaryBodyList = new ArrayList<>(); + binaryBodyList.add(binaryBody); + FlowScheduleUploadAction uploadAction = new FlowScheduleUploadAction(binaryBodyList); +
uploadAction.getFormParams().put("ajax", "upload"); + uploadAction.getFormParams().put("project", projectName); + + uploadAction.getParameters().put("project", projectName); + uploadAction.getParameters().put("ajax", "upload"); + uploadAction.setURl(projectUrl); + + + HttpResult response = SSORequestWTSS.requestWTSSWithSSOUpload(projectUrl, uploadAction, this.ssoRequestService, workspace); + + if (response.getStatusCode() == 200 || response.getStatusCode() == 0) { + LOGGER.info("upload project:{} success!", projectName); + } else { + LOGGER.error("The Azkaban upload API did not return 200, status code is {}", response.getStatusCode()); + throw new ErrorException(90013, "release project failed, " + response.getResponseBody()); + } + + } catch (Exception e) { + LOGGER.error("upload failed, reason:", e); + throw new ErrorException(90014, e.getMessage()); + } finally { + IOUtils.closeQuietly(inputStream); + } + } +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/conversion/NodeConverter.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/conversion/NodeConverter.java new file mode 100644 index 000000000..605c61154 --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/conversion/NodeConverter.java @@ -0,0 +1,25 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License.
+ * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.conversion; + +import com.webank.wedatasphere.dss.workflow.core.entity.WorkflowNode; + +public interface NodeConverter { + + String conversion(WorkflowNode workflowNode); + +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/conversion/ProjectInfoWorkflowToRelConverter.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/conversion/ProjectInfoWorkflowToRelConverter.java new file mode 100644 index 000000000..681a695d6 --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/conversion/ProjectInfoWorkflowToRelConverter.java @@ -0,0 +1,109 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.conversion; + +import com.webank.wedatasphere.dss.appconn.schedulis.conf.AzkabanConf; +import com.webank.wedatasphere.dss.appconn.schedulis.entity.AzkabanConvertedRel; +import com.webank.wedatasphere.dss.appconn.schedulis.entity.AzkabanWorkflow; +import com.webank.wedatasphere.dss.appconn.schedulis.utils.AzkabanUtilsScala; +import com.webank.wedatasphere.dss.common.exception.DSSRuntimeException; +import com.webank.wedatasphere.dss.workflow.conversion.entity.ConvertedRel; +import com.webank.wedatasphere.dss.workflow.conversion.entity.PreConversionRel; +import com.webank.wedatasphere.dss.workflow.conversion.operation.WorkflowToRelConverter; +import com.webank.wedatasphere.dss.workflow.core.entity.Workflow; +import java.io.File; +import java.text.SimpleDateFormat; +import java.util.ArrayList; +import java.util.Date; +import java.util.List; +import org.apache.commons.io.FileUtils; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + + +public class ProjectInfoWorkflowToRelConverter implements WorkflowToRelConverter { + + private static final Logger LOGGER = LoggerFactory.getLogger(ProjectInfoWorkflowToRelConverter.class); + + @Override + public ConvertedRel convertToRel(PreConversionRel rel) { + List<String> repeatNode = AzkabanUtilsScala.getRepeatNodeName(getAllNodeName(rel.getWorkflows())); + if (!repeatNode.isEmpty()) { + throw new DSSRuntimeException(80001, "Duplicate node names: " + repeatNode.toString()); + } + AzkabanConvertedRel azkabanConvertedRel = new AzkabanConvertedRel(rel); + //1. Assign a value to the storePath of the Azkaban scheduler project. + assignStorePath(azkabanConvertedRel); + //2. The storePath of each root flow is also assigned a value. + List<Workflow> workflows = rel.getWorkflows(); + workflows.forEach(flow -> setRootFlowStorePath(azkabanConvertedRel.getStorePath(), flow)); + //3. Delete zip packages and folders that may not have been processed.
+ removeProjectStoreDirAndZip(azkabanConvertedRel); + return azkabanConvertedRel; + } + + private void setRootFlowStorePath(String projectStorePath, Workflow workflow){ + AzkabanWorkflow azkabanWorkflow = (AzkabanWorkflow) workflow; + azkabanWorkflow.setStorePath(projectStorePath + File.separator + workflow.getName()); + } + + private void assignStorePath(AzkabanConvertedRel rel) { + SimpleDateFormat dateFormat = new SimpleDateFormat(AzkabanConvertedRel.DATE_FORMAT); + Date date = new Date(); + String dateStr = dateFormat.format(date); + String userName = rel.getDSSToRelConversionRequestRef().getDSSProject().getUsername(); + String name = rel.getDSSToRelConversionRequestRef().getDSSProject().getName(); + String storePath = AzkabanConf.DEFAULT_STORE_PATH.getValue() + File.separator + userName + + File.separator + dateStr + File.separator + name; + rel.setStorePath(storePath); + } + + private void removeProjectStoreDirAndZip(AzkabanConvertedRel rel) { + String storePath = rel.getStorePath(); + File projectDir = new File(storePath); + try { + if (projectDir.exists()) { + LOGGER.info("project dir {} already exists before publish, now removing it", storePath); + FileUtils.deleteDirectory(projectDir); + } + String projectZip = projectDir.getParent() + File.separator + + rel.getDSSToRelConversionRequestRef().getDSSProject().getName() + ".zip"; + File zipFile = new File(projectZip); + if (zipFile.exists()) { + LOGGER.info("project zip {} already exists before publish, now removing it", projectZip); + zipFile.delete(); + } + } catch (Exception e) { + LOGGER.error("delete project dir or zip failed, reason:", e); + throw new DSSRuntimeException(90020, e.getMessage()); + } + } + + /** + * Get the names of all nodes directly from all flows without recursion.
+     */
+    private List<String> getAllNodeName(List<Workflow> workflows) {
+        List<String> nodeNames = new ArrayList<>();
+        workflows.forEach(flow -> flow.getWorkflowNodes().forEach(node -> nodeNames.add(node.getName())));
+        return nodeNames;
+    }
+
+    @Override
+    public int getOrder() {
+        return 5;
+    }
+}
diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/conversion/WorkflowToAzkbanNodeRelConverter.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/conversion/WorkflowToAzkbanNodeRelConverter.java
new file mode 100644
index 000000000..dcb02ad85
--- /dev/null
+++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/conversion/WorkflowToAzkbanNodeRelConverter.java
@@ -0,0 +1,111 @@
+/*
+ * Copyright 2019 WeBank
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.conversion; + +import com.google.gson.Gson; +import com.webank.wedatasphere.dss.appconn.schedulis.constant.AzkabanConstant; +import com.webank.wedatasphere.dss.appconn.schedulis.entity.AzkabanWorkflow; +import com.webank.wedatasphere.dss.appconn.schedulis.linkisjob.LinkisJobConverter; +import com.webank.wedatasphere.dss.common.entity.Resource; +import com.webank.wedatasphere.dss.common.exception.DSSErrorException; +import com.webank.wedatasphere.dss.common.utils.ClassUtils; +import com.webank.wedatasphere.dss.common.utils.DSSExceptionUtils; +import com.webank.wedatasphere.dss.workflow.conversion.entity.ConvertedRel; +import com.webank.wedatasphere.dss.workflow.conversion.entity.PreConversionRel; +import com.webank.wedatasphere.dss.workflow.conversion.operation.WorkflowToRelConverter; +import com.webank.wedatasphere.dss.workflow.core.entity.Workflow; +import com.webank.wedatasphere.dss.workflow.core.entity.WorkflowNode; +import java.io.File; +import java.io.FileOutputStream; +import java.util.List; +import org.apache.commons.io.FileUtils; +import org.apache.commons.io.IOUtils; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + + +public class WorkflowToAzkbanNodeRelConverter implements WorkflowToRelConverter { + + public static final Logger LOGGER = LoggerFactory.getLogger(WorkflowToAzkbanNodeRelConverter.class); + + private NodeConverter nodeConverter; + + public WorkflowToAzkbanNodeRelConverter() { + nodeConverter = ClassUtils.getInstanceOrDefault(NodeConverter.class, new LinkisJobConverter()); + } + + @Override + public ConvertedRel convertToRel(PreConversionRel rel) { + rel.getWorkflows().forEach(this::convertNode); + return (ConvertedRel) rel; + } + + private void convertNode(Workflow workflow) { + workflow.getWorkflowNodes().forEach(DSSExceptionUtils.handling(workflowNode -> { + String nodeStorePath = getNodeStorePath(((AzkabanWorkflow)workflow).getStorePath(), workflowNode); + 
writeNodeToJobLocal(workflowNode, nodeStorePath);
+            writeNodeResourcesToLocal(workflowNode, nodeStorePath);
+        }));
+        if (workflow.getChildren() != null) {
+            workflow.getChildren().forEach(flow -> convertNode((Workflow) flow));
+        }
+    }
+
+    private String getNodeStorePath(String flowStorePath, WorkflowNode schedulerNode) {
+        return flowStorePath + File.separator + "jobs" + File.separator + schedulerNode.getName();
+    }
+
+    private void writeNodeToJobLocal(WorkflowNode workflowNode, String storePath) throws DSSErrorException {
+        FileOutputStream os = null;
+        try {
+            File jobDirFile = new File(storePath);
+            FileUtils.forceMkdir(jobDirFile);
+            File jobFile = new File(storePath, workflowNode.getName() + AzkabanConstant.AZKABAN_JOB_SUFFIX);
+            jobFile.createNewFile();
+            String nodeString = nodeConverter.conversion(workflowNode);
+            os = FileUtils.openOutputStream(jobFile, true);
+            os.write(nodeString.getBytes());
+        } catch (Exception e) {
+            LOGGER.error("write AppConnNode to job local failed, reason:", e);
+            throw new DSSErrorException(90017, e.getMessage());
+        } finally {
+            IOUtils.closeQuietly(os);
+        }
+    }
+
+    private void writeNodeResourcesToLocal(WorkflowNode workflowNode, String storePath) throws DSSErrorException {
+        List<Resource> nodeResources = workflowNode.getDSSNode().getResources();
+        if (nodeResources == null || nodeResources.isEmpty()) {
+            return;
+        }
+        FileOutputStream os = null;
+        try {
+            File jobFile = new File(storePath, workflowNode.getName() + AzkabanConstant.AZKABAN_JOB_SUFFIX);
+            String nodeResourceString = AzkabanConstant.LINKIS_JOB_RESOURCES_KEY + new Gson().toJson(nodeResources);
+            os = FileUtils.openOutputStream(jobFile, true);
+            os.write(nodeResourceString.getBytes());
+        } catch (Exception e) {
+            LOGGER.error("write nodeResources to local failed, reason:", e);
+            throw new DSSErrorException(90018, e.getMessage());
+        } finally {
+            IOUtils.closeQuietly(os);
+        }
+    }
+
+    @Override
+    public int getOrder() {
+        return 100;
+    }
+}
diff --git
a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/conversion/WorkflowToAzkbanRelConverter.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/conversion/WorkflowToAzkbanRelConverter.java new file mode 100644 index 000000000..a413d9d93 --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/conversion/WorkflowToAzkbanRelConverter.java @@ -0,0 +1,142 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.conversion; + +import com.google.gson.Gson; +import com.webank.wedatasphere.dss.appconn.schedulis.constant.AzkabanConstant; +import com.webank.wedatasphere.dss.appconn.schedulis.entity.AzkabanConvertedRel; +import com.webank.wedatasphere.dss.appconn.schedulis.entity.AzkabanWorkflow; +import com.webank.wedatasphere.dss.common.entity.Resource; +import com.webank.wedatasphere.dss.common.exception.DSSErrorException; +import com.webank.wedatasphere.dss.common.utils.DSSExceptionUtils; +import com.webank.wedatasphere.dss.workflow.common.entity.Flow; +import com.webank.wedatasphere.dss.workflow.conversion.entity.ConvertedRel; +import com.webank.wedatasphere.dss.workflow.conversion.entity.PreConversionRel; +import com.webank.wedatasphere.dss.workflow.conversion.operation.WorkflowToRelConverter; +import com.webank.wedatasphere.dss.workflow.core.entity.Workflow; +import java.io.File; +import java.io.FileOutputStream; +import java.util.List; +import java.util.Map; +import org.apache.commons.io.FileUtils; +import org.apache.commons.io.IOUtils; +import org.apache.commons.lang.StringUtils; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class WorkflowToAzkbanRelConverter implements WorkflowToRelConverter { + + public static final Logger LOGGER = LoggerFactory.getLogger(WorkflowToAzkbanRelConverter.class); + + @Override + public ConvertedRel convertToRel(PreConversionRel rel) { + AzkabanConvertedRel azkabanConvertedRel = (AzkabanConvertedRel) rel; + azkabanConvertedRel.getWorkflows().forEach(DSSExceptionUtils.handling(workflow -> { + //1. Set sub flow and node storage paths + String flowStorePath = ((AzkabanWorkflow) workflow).getStorePath(); + if (workflow.getChildren() != null) { + workflow.getChildren().forEach(flow -> setFlowStorePath(flowStorePath, flow)); + } + // 2. Processing resources, generating files, and so on. 
+            writeWorkflowFiles(workflow, azkabanConvertedRel.getStorePath());
+        }));
+        return azkabanConvertedRel;
+    }
+
+    private void writeWorkflowFiles(Flow workflow, String projectStorePath) throws DSSErrorException {
+        AzkabanWorkflow flow = (AzkabanWorkflow) workflow;
+        writeFlowResourcesToLocal(flow, projectStorePath);
+        writeFlowPropertiesToLocal(flow);
+        if (workflow.getChildren() != null) {
+            workflow.getChildren().forEach(DSSExceptionUtils.handling(f -> writeWorkflowFiles(f, projectStorePath)));
+        }
+    }
+
+    private void setFlowStorePath(String flowStorePath, Flow workflow) {
+        AzkabanWorkflow azkabanWorkflow = (AzkabanWorkflow) workflow;
+        azkabanWorkflow.setStorePath(flowStorePath + File.separator + "subFlows" + File.separator + azkabanWorkflow.getName());
+        if (workflow.getChildren() != null) {
+            workflow.getChildren().forEach(flow -> setFlowStorePath(azkabanWorkflow.getStorePath(), flow));
+        }
+    }
+
+    private void writeFlowResourcesToLocal(AzkabanWorkflow flow, String projectStorePath) throws DSSErrorException {
+        List<Resource> flowResources = flow.getFlowResources();
+        FileOutputStream os = null;
+        try {
+            String storePath = flow.getStorePath();
+            File flowDir = new File(storePath);
+            FileUtils.forceMkdir(flowDir);
+            if (flowResources == null || flowResources.isEmpty()) {
+                return;
+            }
+            String flowResourceStringPrefix = getFlowResourceStringPrefix(projectStorePath, storePath);
+            String flowResourceString = flowResourceStringPrefix + new Gson().toJson(flowResources) + "\n";
+            File projectResourcesFile = new File(projectStorePath, "project.properties");
+            os = FileUtils.openOutputStream(projectResourcesFile, true);
+            os.write(flowResourceString.getBytes());
+        } catch (Exception e) {
+            LOGGER.error("write FlowResources to local failed, reason:", e);
+            throw new DSSErrorException(90006, e.getMessage());
+        } finally {
+            IOUtils.closeQuietly(os);
+        }
+    }
+
+    private String getFlowResourceStringPrefix(String projectStorePath, String storePath) {
+        String substring = storePath.substring(projectStorePath.length() + 1);
+        String prefix = substring.replaceAll("\\" + File.separator + "subFlows" + "\\" + File.separator, ".");
+        return "flow." + prefix + "_.resources=";
+    }
+
+    private void writeFlowPropertiesToLocal(AzkabanWorkflow flow) throws DSSErrorException {
+        List<Map<String, Object>> flowProperties = flow.getFlowProperties();
+        if (flowProperties == null || flowProperties.isEmpty()) {
+            return;
+        }
+        FileOutputStream os = null;
+        try {
+            String storePath = flow.getStorePath();
+            File flowPrpsFile = new File(storePath, flow.getName() + AzkabanConstant.AZKABAN_PROPERTIES_SUFFIX);
+            flowPrpsFile.createNewFile();
+            os = FileUtils.openOutputStream(flowPrpsFile, true);
+            StringBuilder stringBuilder = new StringBuilder();
+            flowProperties.forEach(p -> p.forEach((k, v) -> {
+                stringBuilder.append(AzkabanConstant.LINKIS_FLOW_VARIABLE_KEY + k + "=" + v + "\n");
+            }));
+            // update by peaceWong: add contextID to flow properties
+            String contextID = flow.getContextID();
+            if (StringUtils.isNotBlank(contextID)) {
+                contextID = contextID.replace("\\", "/");
+                LOGGER.info("after replace contextID is {}", contextID);
+                stringBuilder.append(AzkabanConstant.FLOW_CONTEXT_ID + contextID + "\n");
+            }
+            // update end
+            os.write(stringBuilder.toString().getBytes());
+        } catch (Exception e) {
+            LOGGER.error("write flowProperties to local failed, reason:", e);
+            throw new DSSErrorException(90007, e.getMessage());
+        } finally {
+            IOUtils.closeQuietly(os);
+        }
+    }
+
+    @Override
+    public int getOrder() {
+        return 10;
+    }
+}
diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/entity/AzkabanConvertedRel.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/entity/AzkabanConvertedRel.java
new file mode 100644
index 000000000..09c3afc9a
--- /dev/null
+++
b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/entity/AzkabanConvertedRel.java @@ -0,0 +1,53 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.entity; + +import com.webank.wedatasphere.dss.orchestrator.converter.standard.ref.DSSToRelConversionRequestRef; +import com.webank.wedatasphere.dss.orchestrator.converter.standard.ref.ProjectToRelConversionRequestRef; +import com.webank.wedatasphere.dss.workflow.conversion.entity.ConvertedRel; +import com.webank.wedatasphere.dss.workflow.conversion.entity.PreConversionRel; +import com.webank.wedatasphere.dss.workflow.conversion.entity.PreConversionRelImpl; + +public class AzkabanConvertedRel extends PreConversionRelImpl implements ConvertedRel { + + private String storePath; + public static final String DATE_FORMAT = "yyyyMMddHHmmss"; + + public AzkabanConvertedRel(PreConversionRel rel) { + setWorkflows(rel.getWorkflows()); + setDSSToRelConversionRequestRef(rel.getDSSToRelConversionRequestRef()); + } + + public AzkabanConvertedRel(AzkabanConvertedRel rel) { + this((PreConversionRel) rel); + storePath = rel.getStorePath(); + } + + @Override + public ProjectToRelConversionRequestRef getDSSToRelConversionRequestRef() { + return (ProjectToRelConversionRequestRef) super.getDSSToRelConversionRequestRef(); + } + + public String getStorePath() { + return storePath; + } + + 
public void setStorePath(String storePath) { + this.storePath = storePath; + } + +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/entity/AzkabanUserEntity.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/entity/AzkabanUserEntity.java new file mode 100644 index 000000000..d4e839fd6 --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/entity/AzkabanUserEntity.java @@ -0,0 +1,47 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.entity; + +public class AzkabanUserEntity { + private String id; + private String text; + private String username; + + public String getId() { + return id; + } + + public void setId(String id) { + this.id = id; + } + + public String getText() { + return text; + } + + public void setText(String text) { + this.text = text; + } + + public String getUsername() { + return username; + } + + public void setUsername(String username) { + this.username = username; + } +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/entity/AzkabanWorkflow.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/entity/AzkabanWorkflow.java new file mode 100644 index 000000000..1387db767 --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/entity/AzkabanWorkflow.java @@ -0,0 +1,32 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.entity; + +import com.webank.wedatasphere.dss.workflow.core.entity.WorkflowWithContextImpl; + +public class AzkabanWorkflow extends WorkflowWithContextImpl { + + private String storePath; + + public String getStorePath() { + return storePath; + } + + public void setStorePath(String storePath) { + this.storePath = storePath; + } +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/linkisjob/AzkabanSubFlowJobTuning.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/linkisjob/AzkabanSubFlowJobTuning.java new file mode 100644 index 000000000..b083be21a --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/linkisjob/AzkabanSubFlowJobTuning.java @@ -0,0 +1,33 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.linkisjob; + +public class AzkabanSubFlowJobTuning implements LinkisJobTuning { + + @Override + public LinkisJob tuningJob(LinkisJob job) { + job.setType("flow"); + job.setLinkistype(null); + job.getConf().put("flow.name",job.getName() + "_"); + return job; + } + + @Override + public boolean ifJobCantuning(String nodeType) { + return "workflow.subflow".equals(nodeType); + } +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/linkisjob/LinkisJob.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/linkisjob/LinkisJob.java new file mode 100644 index 000000000..5c68f9676 --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/linkisjob/LinkisJob.java @@ -0,0 +1,86 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ *
+ */
+
+package com.webank.wedatasphere.dss.appconn.schedulis.linkisjob;
+
+import java.util.Map;
+
+public class LinkisJob {
+
+    private String name;
+    private String type;
+    private String linkistype;
+    private String proxyUser;
+    private String dependencies;
+    private Map<String, String> conf;
+    private String command;
+
+    public String getName() {
+        return name;
+    }
+
+    public void setName(String name) {
+        this.name = name;
+    }
+
+    public String getType() {
+        return type;
+    }
+
+    public void setType(String type) {
+        this.type = type;
+    }
+
+    public String getLinkistype() {
+        return linkistype;
+    }
+
+    public void setLinkistype(String linkistype) {
+        this.linkistype = linkistype;
+    }
+
+    public String getProxyUser() {
+        return proxyUser;
+    }
+
+    public void setProxyUser(String proxyUser) {
+        this.proxyUser = proxyUser;
+    }
+
+    public String getDependencies() {
+        return dependencies;
+    }
+
+    public void setDependencies(String dependencies) {
+        this.dependencies = dependencies;
+    }
+
+    public Map<String, String> getConf() {
+        return conf;
+    }
+
+    public void setConf(Map<String, String> conf) {
+        this.conf = conf;
+    }
+
+    public String getCommand() {
+        return command;
+    }
+
+    public void setCommand(String command) {
+        this.command = command;
+    }
+
+}
diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/linkisjob/LinkisJobConverter.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/linkisjob/LinkisJobConverter.java
new file mode 100644
index 000000000..69b2cbdde
--- /dev/null
+++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/linkisjob/LinkisJobConverter.java
@@ -0,0 +1,127 @@
+/*
+ * Copyright 2019 WeBank
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+package com.webank.wedatasphere.dss.appconn.schedulis.linkisjob;
+
+import com.google.gson.Gson;
+import com.webank.wedatasphere.dss.appconn.schedulis.conf.AzkabanConf;
+import com.webank.wedatasphere.dss.appconn.schedulis.constant.AzkabanConstant;
+import com.webank.wedatasphere.dss.appconn.schedulis.conversion.NodeConverter;
+import com.webank.wedatasphere.dss.workflow.core.constant.WorkflowConstant;
+import com.webank.wedatasphere.dss.workflow.core.entity.WorkflowNode;
+import java.util.Arrays;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import org.apache.commons.lang.StringUtils;
+
+public class LinkisJobConverter implements NodeConverter {
+
+    private LinkisJobTuning[] linkisJobTunings;
+
+    public LinkisJobConverter() {
+        LinkisJobTuning[] linkisJobTunings = {new AzkabanSubFlowJobTuning()};
+        this.linkisJobTunings = linkisJobTunings;
+    }
+
+    @Override
+    public String conversion(WorkflowNode workflowNode) {
+        return baseConversion(workflowNode);
+    }
+
+    private String baseConversion(WorkflowNode workflowNode) {
+        LinkisJob job = new LinkisJob();
+        job.setConf(new HashMap<>());
+        job.setName(workflowNode.getName());
+        convertHead(workflowNode, job);
+        convertDependencies(workflowNode, job);
+        convertProxyUser(workflowNode, job);
+        convertConfiguration(workflowNode, job);
+        convertJobCommand(workflowNode, job);
+        Arrays.stream(linkisJobTunings).forEach(t -> {
+            if (t.ifJobCantuning(workflowNode.getNodeType())) {
+                t.tuningJob(job);
+            }
+        });
+        return convertJobToString(job);
+    }
+
+    private String convertJobToString(LinkisJob job) {
+        HashMap<String, String> map = new HashMap<>();
+        map.put(AzkabanConstant.LINKIS_VERSION, AzkabanConf.LINKIS_VERSION.getValue());
+        map.put(AzkabanConstant.JOB_TYPE, job.getType());
+        map.put(AzkabanConstant.LINKIS_TYPE, job.getLinkistype());
+        map.put(AzkabanConstant.ZAKABAN_DEPENDENCIES_KEY, job.getDependencies());
+        map.put(WorkflowConstant.PROXY_USER, job.getProxyUser());
+        map.put(AzkabanConstant.JOB_COMMAND, job.getCommand());
+        Map<String, String> labels = new HashMap<>();
+        labels.put("route", AzkabanConf.JOB_LABEL.getValue());
+        map.put(AzkabanConstant.JOB_LABELS, new Gson().toJson(labels));
+        map.putAll(job.getConf());
+        StringBuilder stringBuilder = new StringBuilder();
+        map.forEach((k, v) -> {
+            if (v != null) {
+                stringBuilder.append(k).append("=").append(v).append("\n");
+            }
+        });
+        return stringBuilder.toString();
+    }
+
+    private void convertHead(WorkflowNode workflowNode, LinkisJob job) {
+        job.setType("linkis");
+        job.setLinkistype(workflowNode.getNodeType());
+    }
+
+    private void convertDependencies(WorkflowNode workflowNode, LinkisJob job) {
+        List<String> dependencys = workflowNode.getDSSNode().getDependencys();
+        if (dependencys != null && !dependencys.isEmpty()) {
+            StringBuilder dependencies = new StringBuilder();
+            dependencys.forEach(d -> dependencies.append(d + ","));
+            job.setDependencies(dependencies.substring(0, dependencies.length() - 1));
+        }
+    }
+
+    private void convertProxyUser(WorkflowNode workflowNode, LinkisJob job) {
+        String userProxy = workflowNode.getDSSNode().getUserProxy();
+        if (!StringUtils.isEmpty(userProxy)) {
+            job.setProxyUser(userProxy);
+        }
+    }
+
+    private void convertConfiguration(WorkflowNode workflowNode, LinkisJob job) {
+        Map<String, Object> params = workflowNode.getDSSNode().getParams();
+        if (params != null && !params.isEmpty()) {
+            Object configuration = params.get("configuration");
+            String confprefix = "node.conf.";
+            ((Map<String, Map<String, Object>>) configuration).forEach((k, v) -> {
+                if (null != v) {
+                    v.forEach((k2, v2) -> {
+                        if (null != v2) {
+                            job.getConf().put(confprefix + k + "." + k2, v2.toString());
+                        }
+                    });
+                }
+            });
+        }
+    }
+
+    private void convertJobCommand(WorkflowNode workflowNode, LinkisJob job) {
+        Map<String, Object> jobContent = workflowNode.getDSSNode().getJobContent();
+        if (jobContent != null) {
+            jobContent.remove("jobParams");
+            job.setCommand(new Gson().toJson(jobContent));
+        }
+    }
+}
diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/linkisjob/LinkisJobTuning.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/linkisjob/LinkisJobTuning.java
new file mode 100644
index 000000000..5f76cd64b
--- /dev/null
+++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/linkisjob/LinkisJobTuning.java
@@ -0,0 +1,24 @@
+/*
+ * Copyright 2019 WeBank
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.linkisjob; + +public interface LinkisJobTuning { + + LinkisJob tuningJob(LinkisJob job); + + boolean ifJobCantuning(String nodeType); +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/operation/SchedulisProjectCreationOperation.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/operation/SchedulisProjectCreationOperation.java new file mode 100644 index 000000000..05671d236 --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/operation/SchedulisProjectCreationOperation.java @@ -0,0 +1,126 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.operation; + +import com.webank.wedatasphere.dss.appconn.schedulis.ref.SchedulisProjectResponseRef; +import com.webank.wedatasphere.dss.appconn.schedulis.service.SchedulisProjectService; +import com.webank.wedatasphere.dss.appconn.schedulis.utils.AzkabanUtils; +import com.webank.wedatasphere.dss.appconn.schedulis.utils.SSORequestWTSS; +import com.webank.wedatasphere.dss.appconn.schedulis.utils.SchedulisExceptionUtils; +import com.webank.wedatasphere.dss.standard.app.structure.StructureService; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectCreationOperation; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectRequestRef; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectResponseRef; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectService; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import org.codehaus.jackson.JsonNode; +import org.codehaus.jackson.map.ObjectMapper; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import java.util.HashMap; +import java.util.Map; + +public class SchedulisProjectCreationOperation implements ProjectCreationOperation { + + private static final Logger LOGGER = LoggerFactory.getLogger(SchedulisProjectCreationOperation.class); + + private ProjectService schedulisProjectService; + + private String projectUrl; + + private String managerUrl; + + private static Long DEFAULT_PROJECT_ID = 0L; + + + public SchedulisProjectCreationOperation() { + } + + @Override + public void init() { + this.projectUrl = this.schedulisProjectService.getAppInstance().getBaseUrl().endsWith("/") ? 
+ this.schedulisProjectService.getAppInstance().getBaseUrl() + "manager" : + this.schedulisProjectService.getAppInstance().getBaseUrl() + "/manager"; + managerUrl = this.schedulisProjectService.getAppInstance().getBaseUrl().endsWith("/") ? this.schedulisProjectService.getAppInstance().getBaseUrl() + "manager" : + this.schedulisProjectService.getAppInstance().getBaseUrl() + "/manager"; + } + + + @Override + public ProjectResponseRef createProject(ProjectRequestRef requestRef) throws ExternalOperationFailedException { + LOGGER.info("begin to create project in schedulis project is {}", requestRef.getName()); + SchedulisProjectResponseRef responseRef = new SchedulisProjectResponseRef(); + Map params = new HashMap<>(); + params.put("action", "create"); + params.put("name", requestRef.getName()); + params.put("description", requestRef.getDescription()); + try { + + String entStr = SSORequestWTSS.requestWTSSWithSSOPost(projectUrl,params,this.schedulisProjectService,requestRef.getWorkspace()); + LOGGER.error("新建工程 {}, azkaban 返回的信息是 {}", requestRef.getName(), entStr); + String message = AzkabanUtils.handleAzkabanEntity(entStr); + if (!"success".equals(message)) { + throw new ExternalOperationFailedException(90008, "新建工程失败, 原因:" + message); + } + + } catch (final Exception t) { + LOGGER.error("Failed to create project!",t); + } + try { + DEFAULT_PROJECT_ID = getSchedulisProjectId(requestRef.getName(),requestRef); + } catch (Exception e) { + SchedulisExceptionUtils.dealErrorException(60051, "failed to get project id", e, + ExternalOperationFailedException.class); + } + + responseRef.setProjectRefId(DEFAULT_PROJECT_ID); + // There is no project ID returned in schedulis, so there is no need to set. + // Other exceptions are thrown out, so the correct code returned as 0 is OK. + return responseRef; + } + + @Override + public void setStructureService(StructureService service) { + this.schedulisProjectService = (SchedulisProjectService) service; + } + + /** + * Get project ID. 
+     */
+    public Long getSchedulisProjectId(String projectName, ProjectRequestRef requestRef) throws Exception {
+        Map params = new HashMap<>();
+        params.put("ajax", "getProjectId");
+        params.put("project", projectName);
+
+        long projectId = 0L;
+        try {
+            String content = SSORequestWTSS.requestWTSSWithSSOGet(this.managerUrl, params, this.schedulisProjectService.getSSORequestService(), requestRef.getWorkspace());
+            LOGGER.info("Get schedulis project id return str is " + content);
+            ObjectMapper objectMapper = new ObjectMapper();
+            JsonNode jsonNode = objectMapper.readValue(content, JsonNode.class);
+            projectId = jsonNode.get("projectId").getLongValue();
+        } catch (final Throwable t) {
+            SchedulisExceptionUtils.dealErrorException(60051, "failed to get project id from schedulis", t,
+                ExternalOperationFailedException.class);
+        }
+        return projectId;
+    }
+
+}
diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/operation/SchedulisProjectDeletionOperation.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/operation/SchedulisProjectDeletionOperation.java
new file mode 100644
index 000000000..0163655b8
--- /dev/null
+++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/operation/SchedulisProjectDeletionOperation.java
@@ -0,0 +1,70 @@
+/*
+ * Copyright 2019 WeBank
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.operation; + +import com.webank.wedatasphere.dss.appconn.schedulis.ref.SchedulisProjectResponseRef; +import com.webank.wedatasphere.dss.appconn.schedulis.utils.SSORequestWTSS; +import com.webank.wedatasphere.dss.appconn.schedulis.utils.SchedulisExceptionUtils; +import com.webank.wedatasphere.dss.standard.app.structure.StructureService; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectDeletionOperation; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectRequestRef; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectResponseRef; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectService; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import java.util.HashMap; +import java.util.Map; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class SchedulisProjectDeletionOperation implements ProjectDeletionOperation { + + private static final Logger LOGGER = LoggerFactory.getLogger(SchedulisProjectDeletionOperation.class); + + private ProjectService schedulisProjectService; + private String managerUrl; + + public SchedulisProjectDeletionOperation(){ + } + + @Override + public void init() { + managerUrl = this.schedulisProjectService.getAppInstance().getBaseUrl().endsWith("/") ? 
this.schedulisProjectService.getAppInstance().getBaseUrl() + "manager" : + this.schedulisProjectService.getAppInstance().getBaseUrl() + "/manager"; + } + + @Override + public void setStructureService(StructureService service) { + schedulisProjectService = (ProjectService) service; + init(); + } + + @Override + public ProjectResponseRef deleteProject(ProjectRequestRef projectRef) throws ExternalOperationFailedException { + try { + Map params = new HashMap<>(); + params.put("project", projectRef.getName()); + params.put("delete", "true"); + String responseContent =SSORequestWTSS.requestWTSSWithSSOGet(this.managerUrl,params,this.schedulisProjectService.getSSORequestService(),projectRef.getWorkspace()); + LOGGER.info(" deleteWtssProject --response-->{}",responseContent); + } catch (Exception e){ + SchedulisExceptionUtils.dealErrorException(60052, "failed to delete project in schedulis", e, + ExternalOperationFailedException.class); + } + return new SchedulisProjectResponseRef(); + } + +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/operation/SchedulisProjectUpdateOperation.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/operation/SchedulisProjectUpdateOperation.java new file mode 100644 index 000000000..5ce09355c --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/operation/SchedulisProjectUpdateOperation.java @@ -0,0 +1,53 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.operation; + +import com.webank.wedatasphere.dss.appconn.schedulis.service.SchedulisProjectService; +import com.webank.wedatasphere.dss.standard.app.structure.StructureService; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectRequestRef; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectResponseRef; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectService; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectUpdateOperation; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class SchedulisProjectUpdateOperation implements ProjectUpdateOperation { + + + private static final Logger LOGGER = LoggerFactory.getLogger(SchedulisProjectUpdateOperation.class); + + private ProjectService schedulisProjectService; + + public SchedulisProjectUpdateOperation(){ + + } + + @Override + public void init() { + } + + @Override + public void setStructureService(StructureService service) { + this.schedulisProjectService = (SchedulisProjectService) service; + } + + @Override + public ProjectResponseRef updateProject(ProjectRequestRef projectRef) { + return null; + } +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/parser/AzkabanWorkflowParser.java 
b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/parser/AzkabanWorkflowParser.java new file mode 100644 index 000000000..0c0377c1f --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/parser/AzkabanWorkflowParser.java @@ -0,0 +1,94 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.parser; + +import com.google.gson.JsonObject; +import com.webank.wedatasphere.dss.appconn.schedulis.entity.AzkabanWorkflow; +import com.webank.wedatasphere.dss.common.entity.node.DSSNodeDefault; +import com.webank.wedatasphere.dss.common.exception.DSSRuntimeException; +import com.webank.wedatasphere.dss.workflow.core.entity.Workflow; +import com.webank.wedatasphere.dss.workflow.core.entity.WorkflowNode; +import com.webank.wedatasphere.dss.workflow.core.entity.WorkflowNodeEdge; +import com.webank.wedatasphere.dss.workflow.core.entity.WorkflowNodeImpl; +import com.webank.wedatasphere.dss.workflow.core.json2flow.parser.WorkflowParser; +import java.util.ArrayList; +import java.util.HashMap; +import java.util.List; +import java.util.Map; +import org.apache.commons.beanutils.BeanUtils; + +public class AzkabanWorkflowParser implements WorkflowParser { + + @Override + public Workflow parse(JsonObject flowJson, Workflow workflow) { + AzkabanWorkflow azkabanWorkflow = new 
AzkabanWorkflow(); + try { + BeanUtils.copyProperties(azkabanWorkflow, workflow); + } catch (Exception e) { + throw new DSSRuntimeException(91500, "Copy workflow fields failed!", e); + } + return addEndNodeForFlowName(azkabanWorkflow); + } + + private AzkabanWorkflow addEndNodeForFlowName(AzkabanWorkflow flow) { + DSSNodeDefault endNode = new DSSNodeDefault(); + List endNodeList = getFlowEndJobList(flow); + if(flow.getRootFlow()){ + endNode.setId(flow.getName()); + endNode.setName(flow.getName()); + }else{ + endNode.setId(flow.getName() + "_"); + endNode.setName(flow.getName() + "_"); + } + endNode.setNodeType("linkis.control.empty"); + Map jobContentMap = new HashMap<>(); + endNode.setJobContent(jobContentMap); + if (!endNodeList.isEmpty()) { + if(endNodeList.size() == 1 ) { + if(endNodeList.get(0).getName().equals(flow.getName())){ + return flow; + } + } + endNodeList.forEach(tmpNode -> endNode.addDependency(tmpNode.getName())); + WorkflowNode azkabanSchedulerNode = new WorkflowNodeImpl(); + azkabanSchedulerNode.setDSSNode(endNode); + flow.getWorkflowNodes().add(azkabanSchedulerNode); + } + return flow; + } + + private List getFlowEndJobList(AzkabanWorkflow flow) { + List res = new ArrayList<>(); + for (WorkflowNode job : flow.getWorkflowNodes()) { + int flag = 0; + for (WorkflowNodeEdge link : flow.getWorkflowNodeEdges()) { + if (job.getId().equals(link.getDSSEdge().getSource())) { + flag = 1; + } + } + if (flag == 0) { + res.add(job); + } + } + return res; + } + + @Override + public int getOrder() { + return 100; + } +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/ref/SchedulisProjectResponseRef.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/ref/SchedulisProjectResponseRef.java new file mode 100644 index 000000000..db0f7c99f --- /dev/null +++ 
b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/ref/SchedulisProjectResponseRef.java @@ -0,0 +1,83 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.ref; + +import com.webank.wedatasphere.dss.appconn.schedulis.conf.SchedulisConf; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectResponseRef; +import com.webank.wedatasphere.dss.standard.common.desc.AppInstance; +import com.webank.wedatasphere.dss.standard.common.entity.ref.AbstractResponseRef; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import java.util.HashMap; +import java.util.Map; + +public class SchedulisProjectResponseRef extends AbstractResponseRef implements ProjectResponseRef { + + + private static final Logger LOGGER = LoggerFactory.getLogger(SchedulisProjectResponseRef.class); + + private String errorMsg; + + private Long projectRefId; + + public SchedulisProjectResponseRef(){ + super("", 0); + } + + + public SchedulisProjectResponseRef(String responseBody, int status, String errorMsg) { + super(responseBody, status); + this.errorMsg = errorMsg; + } + + @SuppressWarnings("unchecked") + @Override + public Map toMap() { + try{ + return SchedulisConf.gson().fromJson(this.getResponseBody(), Map.class); + }catch(Exception e){ + LOGGER.error("failed to covert {} to a map", this.getResponseBody(), e); + return new 
HashMap(); + } + } + + @Override + public String getErrorMsg() { + return this.errorMsg; + } + + public void setErrorMsg(String errorMsg){ + this.errorMsg = errorMsg; + } + + + @Override + public Long getProjectRefId() { + return this.projectRefId; + } + + @Override + public Map getProjectRefIds() { + return null; + } + + public void setProjectRefId(Long projectRefId){ + this.projectRefId = projectRefId; + } + +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/service/AzkabanUserService.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/service/AzkabanUserService.java new file mode 100644 index 000000000..d84241f64 --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/service/AzkabanUserService.java @@ -0,0 +1,72 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.service; + +import com.google.gson.Gson; +import com.webank.wedatasphere.dss.appconn.schedulis.entity.AzkabanUserEntity; +import com.webank.wedatasphere.dss.appconn.schedulis.utils.SSORequestWTSS; +import com.webank.wedatasphere.dss.common.utils.DSSCommonUtils; +import com.webank.wedatasphere.dss.standard.app.sso.Workspace; +import com.webank.wedatasphere.dss.standard.app.sso.request.SSORequestService; +import com.webank.wedatasphere.linkis.common.conf.CommonVars; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import java.util.HashMap; +import java.util.List; +import java.util.Map; + +public class AzkabanUserService { + + private static Map schedulisUserMap = new HashMap<>(); + + private static final Logger LOGGER = LoggerFactory.getLogger(AzkabanUserService.class); + private static final String ADMIN_USER = CommonVars.apply("wds.dss.schedulis.admin.user", "superadmin").getValue(); + + private static void requestUserId(String baseUrl, SSORequestService ssoRequestService, Workspace workspace) { + try { + Map params = new HashMap<>(); + params.put("page", "1"); + params.put("pageSize", "10000"); + params.put("ajax","loadSystemUserSelectData"); + baseUrl = !baseUrl.endsWith("/") ? 
(baseUrl + "/") : baseUrl;
+            String finalUrl = baseUrl + "system";
+            LOGGER.info("Request User info from wtss url: " + finalUrl);
+            String response = SSORequestWTSS.requestWTSSWithSSOGet(finalUrl, params, ssoRequestService, workspace);
+            Map map = DSSCommonUtils.COMMON_GSON.fromJson(response, Map.class);
+            if (map.get("systemUserList") instanceof List) {
+                ((List) map.get("systemUserList")).forEach(e -> {
+                    AzkabanUserEntity entity = new Gson().fromJson(e.toString(), AzkabanUserEntity.class);
+                    schedulisUserMap.put(entity.getUsername(), entity.getId());
+                });
+            }
+        } catch (Exception e) {
+            LOGGER.error("get user from wtss failed.", e);
+        }
+    }
+
+    public static String getUserIdByName(String username, String baseUrl, SSORequestService ssoRequestService, Workspace workspace) {
+        if (schedulisUserMap.containsKey(username)) {
+            return schedulisUserMap.get(username);
+        } else {
+            requestUserId(baseUrl, ssoRequestService, workspace);
+            return schedulisUserMap.get(username);
+        }
+    }
+}
diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/service/SchedulisProjectService.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/service/SchedulisProjectService.java
new file mode 100644
index 000000000..6325a9dbf
--- /dev/null
+++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/service/SchedulisProjectService.java
@@ -0,0 +1,52 @@
+/*
+ * Copyright 2019 WeBank
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.service; + +import com.webank.wedatasphere.dss.appconn.schedulis.operation.SchedulisProjectCreationOperation; +import com.webank.wedatasphere.dss.appconn.schedulis.operation.SchedulisProjectDeletionOperation; +import com.webank.wedatasphere.dss.appconn.schedulis.operation.SchedulisProjectUpdateOperation; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectCreationOperation; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectDeletionOperation; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectService; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectUpdateOperation; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectUrlOperation; + +public class SchedulisProjectService extends ProjectService { + + public SchedulisProjectService(){ + } + + @Override + protected ProjectCreationOperation createProjectCreationOperation() { + return new SchedulisProjectCreationOperation(); + } + + @Override + protected ProjectUpdateOperation createProjectUpdateOperation() { + return new SchedulisProjectUpdateOperation(); + } + + @Override + protected ProjectDeletionOperation createProjectDeletionOperation() { + return new SchedulisProjectDeletionOperation(); + } + + @Override + protected ProjectUrlOperation createProjectUrlOperation() { + return null; + } +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/sso/SchedulisHttpGet.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/sso/SchedulisHttpGet.java new file mode 100644 index 000000000..0eb88e4e8 --- /dev/null +++ 
b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/sso/SchedulisHttpGet.java @@ -0,0 +1,33 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.sso; + +import org.apache.http.client.methods.HttpGet; + +public class SchedulisHttpGet extends HttpGet implements UserInfo { + + private String user; + + public SchedulisHttpGet(String url, String user){ + super(url); + this.user = user; + } + @Override + public String getUser() { + return user; + } +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/sso/SchedulisHttpPost.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/sso/SchedulisHttpPost.java new file mode 100644 index 000000000..3782e5605 --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/sso/SchedulisHttpPost.java @@ -0,0 +1,36 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.sso; + +import org.apache.http.client.methods.HttpPost; + +public class SchedulisHttpPost extends HttpPost implements UserInfo{ + + private String user; + + public SchedulisHttpPost(String url, String user){ + super(url); + this.user = user; + } + + @Override + public String getUser() { + return user; + } + + +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/sso/SchedulisSecurityService.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/sso/SchedulisSecurityService.java new file mode 100644 index 000000000..0bdc6cdb4 --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/sso/SchedulisSecurityService.java @@ -0,0 +1,176 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.sso; + + +import com.google.common.cache.Cache; +import com.google.common.cache.CacheBuilder; +import com.webank.wedatasphere.dss.appconn.schedulis.conf.AzkabanConf; +import com.webank.wedatasphere.linkis.common.exception.ErrorException; +import com.webank.wedatasphere.linkis.common.utils.Utils; +import org.apache.commons.io.IOUtils; +import org.apache.http.HttpEntity; +import org.apache.http.NameValuePair; +import org.apache.http.client.CookieStore; +import org.apache.http.client.entity.UrlEncodedFormEntity; +import org.apache.http.client.methods.CloseableHttpResponse; +import org.apache.http.client.methods.HttpPost; +import org.apache.http.client.protocol.HttpClientContext; +import org.apache.http.cookie.Cookie; +import org.apache.http.impl.client.BasicCookieStore; +import org.apache.http.impl.client.CloseableHttpClient; +import org.apache.http.impl.client.HttpClients; +import org.apache.http.message.BasicNameValuePair; +import org.apache.http.util.EntityUtils; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import java.io.IOException; +import java.util.ArrayList; +import java.util.List; +import java.util.Optional; +import java.util.Properties; +import java.util.concurrent.TimeUnit; + + +public final class SchedulisSecurityService { + + private static Logger LOGGER = LoggerFactory.getLogger(SchedulisSecurityService.class); + + + private String securityUrl = ""; + private static final String USER_NAME_KEY = "username"; + private static final String USER_TOKEN_KEY = AzkabanConf.AZKABAN_LOGIN_PWD.getValue(); + private static final String SESSION_ID_KEY = "azkaban.browser.session.id"; + private static Properties userToken; + private static final String USER_STR = "username"; + + private static final String USER_RSA = AzkabanConf.AZKABAN_RSA.getValue(); + private static final String USER_SECRET = "dss_secret"; + + private static final String CIPHER_STR = "userpwd"; + + + private static 
final String SUPER_USER = "dws-wtss";
+    private static final String SUPER_USER_CIPHER = "WeBankBDPWTSS&DWS@2019";
+
+    private static final String SUPER_USER_STR = "superUser";
+    private static final String SUPER_USER_CIPHER_STR = "superUserPwd";
+
+
+    private Cache cookieCache = CacheBuilder.newBuilder()
+            .expireAfterAccess(30 * 60, TimeUnit.SECONDS)
+            .build();
+
+    private static SchedulisSecurityService instance;
+
+    private SchedulisSecurityService(String baseUrl) {
+        this.securityUrl = baseUrl.endsWith("/") ? baseUrl + "checkin" : baseUrl + "/checkin";
+    }
+
+    public static SchedulisSecurityService getInstance(String baseUrl) {
+        if (null == instance) {
+            synchronized (SchedulisSecurityService.class) {
+                if (null == instance) {
+                    instance = new SchedulisSecurityService(baseUrl);
+                }
+            }
+        }
+        return instance;
+    }
+
+    static {
+        Utils.defaultScheduler().scheduleAtFixedRate(() -> {
+            LOGGER.info("Start loading the user token file.");
+            Properties properties = new Properties();
+            try {
+                properties.load(SchedulisSecurityService.class.getClassLoader().getResourceAsStream("token.properties"));
+                userToken = properties;
+            } catch (IOException e) {
+                LOGGER.error("Failed to load the user token file: ", e);
+            }
+        }, 0, 10, TimeUnit.MINUTES);
+    }
+
+
+    public Cookie login(String user) throws Exception {
+        synchronized (user.intern()) {
+            Cookie session = cookieCache.getIfPresent(user);
+            if (session != null) {
+                return session;
+            }
+            Cookie newCookie = getCookie(user, getUserToken(user));
+            cookieCache.put(user, newCookie);
+            return newCookie;
+        }
+    }
+
+    private String getUserToken(String user) {
+        // Read the token directly from the configuration file; implement your own lookup here if needed.
+        Object token = userToken.get(user);
+        if (token == null) {
+            return "";
+        }
+        return token.toString();
+    }
+
+
+    private Cookie getCookie(String user, String token) throws Exception {
+        HttpPost httpPost = new HttpPost(securityUrl);
+        List params = new ArrayList<>();
+        params.add(new BasicNameValuePair(USER_NAME_KEY, user));
+        params.add(new BasicNameValuePair(USER_TOKEN_KEY, token));
+        params.add(new 
BasicNameValuePair(USER_STR, user)); + params.add(new BasicNameValuePair(CIPHER_STR, token)); + + params.add(new BasicNameValuePair(SUPER_USER_STR, SUPER_USER)); + params.add(new BasicNameValuePair(SUPER_USER_CIPHER_STR, SUPER_USER_CIPHER)); + params.add(new BasicNameValuePair(USER_SECRET, USER_RSA)); + + params.add(new BasicNameValuePair("action", "login")); + httpPost.setEntity(new UrlEncodedFormEntity(params)); + CookieStore cookieStore = new BasicCookieStore(); + CloseableHttpClient httpClient = null; + CloseableHttpResponse response = null; + HttpClientContext context; + String responseContent; + try { + httpClient = HttpClients.custom().setDefaultCookieStore(cookieStore).build(); + context = HttpClientContext.create(); + response = httpClient.execute(httpPost, context); + HttpEntity entity = response.getEntity(); + responseContent = EntityUtils.toString(entity, "utf-8"); + LOGGER.info("Get azkaban response code is " + response.getStatusLine().getStatusCode() + ",response: " + responseContent); + if (response.getStatusLine().getStatusCode() != 200) { + throw new ErrorException(90041, responseContent); + } + + } finally { + IOUtils.closeQuietly(response); + IOUtils.closeQuietly(httpClient); + } + List cookies = context.getCookieStore().getCookies(); + Optional cookie = cookies.stream().filter(this::findSessionId).findAny(); + return cookie.orElseThrow(() -> new ErrorException(90041, "Get azkaban session is null : " + responseContent)); + } + + private boolean findSessionId(Cookie cookie) { + return SESSION_ID_KEY.equals(cookie.getName()); + } + + +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/sso/UserInfo.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/sso/UserInfo.java new file mode 100644 index 000000000..e9285a86c --- /dev/null +++ 
b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/sso/UserInfo.java @@ -0,0 +1,24 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.sso; + + +public interface UserInfo { + + String getUser(); + +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/standard/SchedulisStructureStandard.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/standard/SchedulisStructureStandard.java new file mode 100644 index 000000000..e28702086 --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/standard/SchedulisStructureStandard.java @@ -0,0 +1,49 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.standard; + +import com.webank.wedatasphere.dss.appconn.schedulis.service.SchedulisProjectService; +import com.webank.wedatasphere.dss.standard.app.structure.AbstractStructureIntegrationStandard; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectService; + + +/** + * Schedulis's engineering integration specification is a singleton. + */ +public class SchedulisStructureStandard extends AbstractStructureIntegrationStandard { + + private volatile static SchedulisStructureStandard instance; + + private SchedulisStructureStandard(){ + } + + public static SchedulisStructureStandard getInstance(){ + if(instance == null){ + synchronized (SchedulisStructureStandard.class){ + if (instance == null){ + instance = new SchedulisStructureStandard(); + } + } + } + return instance; + } + + @Override + protected ProjectService createProjectService() { + return new SchedulisProjectService(); + } +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/utils/AzkabanUtils.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/utils/AzkabanUtils.java new file mode 100644 index 000000000..2ecbe9b2d --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/utils/AzkabanUtils.java @@ -0,0 +1,66 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+ * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.utils; + +import com.google.gson.Gson; +import org.apache.commons.lang.StringUtils; + +import java.io.IOException; +import java.util.Map; + +public class AzkabanUtils { + public static String handleAzkabanEntity(String entityString) throws IOException { + Gson gson = new Gson(); + Object object = gson.fromJson(entityString, Object.class); + String status = null; + String message = null; + if (object instanceof Map) { + Map map = (Map) object; + if (map.get("status") != null) { + status = map.get("status").toString(); + } + if (StringUtils.isNotEmpty(status)) { + if (null != map.get("message")) { + message = map.get("message").toString(); + } + } + if ("error".equalsIgnoreCase(status)) { + return message; + } + } + return "success"; + } + + public static String getValueFromEntity(String entityString, String searchKey) throws IOException { + Gson gson = new Gson(); + Object object = gson.fromJson(entityString, Object.class); + String status = null; + String valueStr = null; + if (object instanceof Map) { + Map map = (Map) object; + if (map.get("status") != null) { + status = map.get("status").toString(); + } + if (StringUtils.isNotEmpty(status) && status.equals("success")) { + if (null != map.get(searchKey)) { + valueStr = map.get(searchKey).toString(); + } + } + } + return valueStr; + } +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/utils/AzkabanUtilsScala.scala b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/utils/AzkabanUtilsScala.scala new file mode 100644 index 000000000..7bff6817f --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/utils/AzkabanUtilsScala.scala @@ -0,0 +1,30 @@ +/* + * Copyright 
2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.utils + +import java.util +import scala.collection.JavaConversions._ +import scala.collection.JavaConverters._ + +object AzkabanUtilsScala { + + def getRepeatNodeName(nodeList:java.util.List[String]):util.List[String]={ + val res = nodeList.map(x=>(x,1)).groupBy(x => x._1).map(x =>(x._1,x._2.size)).filter(x=>x._2 >1).map(x=>x._1).toList + res.asJava + } + +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/utils/SSORequestWTSS.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/utils/SSORequestWTSS.java new file mode 100644 index 000000000..08f99fb60 --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/utils/SSORequestWTSS.java @@ -0,0 +1,162 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+ * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.utils; + +import static com.webank.wedatasphere.dss.appconn.schedulis.SchedulisAppConn.SCHEDULIS_APPCONN_NAME; + +import com.webank.wedatasphere.dss.appconn.schedulis.Action.FlowScheduleAction; +import com.webank.wedatasphere.dss.appconn.schedulis.Action.FlowScheduleGetAction; +import com.webank.wedatasphere.dss.appconn.schedulis.Action.FlowScheduleUploadAction; +import com.webank.wedatasphere.dss.standard.app.sso.Workspace; +import com.webank.wedatasphere.dss.standard.app.sso.builder.SSOUrlBuilderOperation; +import com.webank.wedatasphere.dss.standard.app.sso.request.SSORequestOperation; +import com.webank.wedatasphere.dss.standard.app.sso.request.SSORequestService; +import com.webank.wedatasphere.dss.standard.common.app.AppIntegrationService; +import com.webank.wedatasphere.dss.standard.common.exception.AppStandardErrorException; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import com.webank.wedatasphere.linkis.httpclient.request.HttpAction; +import com.webank.wedatasphere.linkis.httpclient.response.HttpResult; +import java.util.List; +import java.util.Map; +import org.apache.commons.io.IOUtils; +import org.apache.http.Consts; +import org.apache.http.HttpEntity; +import org.apache.http.NameValuePair; +import org.apache.http.client.CookieStore; +import org.apache.http.client.entity.EntityBuilder; +import org.apache.http.client.methods.CloseableHttpResponse; +import org.apache.http.client.methods.HttpPost; +import org.apache.http.cookie.Cookie; +import org.apache.http.entity.ContentType; +import org.apache.http.impl.client.BasicCookieStore; +import org.apache.http.impl.client.CloseableHttpClient; +import org.apache.http.impl.client.HttpClients; +import org.apache.http.protocol.HTTP; +import org.slf4j.Logger; +import 
org.slf4j.LoggerFactory; + + +public class SSORequestWTSS { + + private final static Logger logger = LoggerFactory.getLogger(SSORequestWTSS.class); + + public static String requestWTSS(String url, String username, List<NameValuePair> params) throws Exception { + HttpPost httpPost = new HttpPost(url); + CookieStore cookieStore = new BasicCookieStore(); + HttpEntity entity = EntityBuilder.create(). + setContentType(ContentType.create("application/x-www-form-urlencoded", Consts.UTF_8)) + .setParameters(params).build(); + httpPost.setEntity(entity); + CloseableHttpClient httpClient = null; + CloseableHttpResponse response = null; + try { + httpClient = HttpClients.custom().setDefaultCookieStore(cookieStore).build(); + response = httpClient.execute(httpPost); + HttpEntity ent = response.getEntity(); + return IOUtils.toString(ent.getContent(), "utf-8"); + } catch (Exception e) { + logger.error("requestWTSSError-->", e); + throw e; + } finally { + IOUtils.closeQuietly(response); + IOUtils.closeQuietly(httpClient); + } + } + + + public static String requestWTSSWithSSOPost( String url, + Map params, + AppIntegrationService service, + Workspace workspace + ) throws Exception { + + try { + + FlowScheduleAction flowScheduleAction = new FlowScheduleAction(); + flowScheduleAction.getFormParams().putAll(params); + SSOUrlBuilderOperation ssoUrlBuilderOperation = workspace.getSSOUrlBuilderOperation().copy(); + ssoUrlBuilderOperation.setAppName(SCHEDULIS_APPCONN_NAME); + ssoUrlBuilderOperation.setReqUrl(url); + ssoUrlBuilderOperation.setWorkspace(workspace.getWorkspaceName()); + flowScheduleAction.setURL(ssoUrlBuilderOperation.getBuiltUrl()); + SSORequestOperation<HttpAction, HttpResult> ssoRequestOperation = service.getSSORequestService().createSSORequestOperation(SCHEDULIS_APPCONN_NAME); + HttpResult previewResult = ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, flowScheduleAction); + if (previewResult.getStatusCode() == 200 
|| previewResult.getStatusCode() == 0) { + String response = previewResult.getResponseBody(); + return response; + } else { + throw new ExternalOperationFailedException(50063, "User sso request failed: " + url); + } + + } catch (Exception e) { + logger.error("requestWTSSPostError-->", e); + throw e; + } + } + + public static String requestWTSSWithSSOGet( String url, + Map params, + SSORequestService service, + Workspace workspace + ) throws Exception { + + try { + + FlowScheduleGetAction flowScheduleGetAction = new FlowScheduleGetAction(); + flowScheduleGetAction.getParameters().putAll(params); + SSOUrlBuilderOperation ssoUrlBuilderOperation = workspace.getSSOUrlBuilderOperation().copy(); + ssoUrlBuilderOperation.setAppName(SCHEDULIS_APPCONN_NAME); + ssoUrlBuilderOperation.setReqUrl(url); + ssoUrlBuilderOperation.setWorkspace(workspace.getWorkspaceName()); + flowScheduleGetAction.setURL(ssoUrlBuilderOperation.getBuiltUrl()); + SSORequestOperation<HttpAction, HttpResult> ssoRequestOperation = service.createSSORequestOperation(SCHEDULIS_APPCONN_NAME); + HttpResult previewResult = ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, flowScheduleGetAction); + String response = previewResult.getResponseBody(); + return response; + + } catch (Exception e) { + logger.error("requestWTSSGetError-->", e); + throw e; + } + } + + + public static HttpResult requestWTSSWithSSOUpload(String url, + FlowScheduleUploadAction uploadAction, + SSORequestService service, + Workspace workspace) throws AppStandardErrorException { + HttpResult previewResult = null; + try { + + SSOUrlBuilderOperation ssoUrlBuilderOperation = workspace.getSSOUrlBuilderOperation().copy(); + ssoUrlBuilderOperation.setAppName(SCHEDULIS_APPCONN_NAME); + ssoUrlBuilderOperation.setReqUrl(url); + ssoUrlBuilderOperation.setWorkspace(workspace.getWorkspaceName()); + uploadAction.setURl(ssoUrlBuilderOperation.getBuiltUrl()); + SSORequestOperation<HttpAction, HttpResult> ssoRequestOperation = service.createSSORequestOperation(SCHEDULIS_APPCONN_NAME); + previewResult = 
ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, uploadAction); + + }catch (AppStandardErrorException e){ + logger.error("uploadWTSSGetError-->", e); + throw e; + } + return previewResult; + } +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/utils/SchedulisExceptionUtils.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/utils/SchedulisExceptionUtils.java new file mode 100644 index 000000000..c6b66c94d --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/utils/SchedulisExceptionUtils.java @@ -0,0 +1,59 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.utils; + +import com.webank.wedatasphere.linkis.common.exception.ErrorException; + +import java.lang.reflect.Constructor; + +public class SchedulisExceptionUtils { + + public static <T extends ErrorException> void dealErrorException(int errorCode, String errorDesc, Throwable throwable, + Class<T> clazz) throws T { + T errorException = null; + try { + Constructor<T> constructor = clazz.getConstructor(int.class, String.class, Throwable.class); + errorException = constructor.newInstance(errorCode, errorDesc, throwable); + errorException.setErrCode(errorCode); + errorException.setDesc(errorDesc); + } catch (Exception e) { + throw new RuntimeException(String.format("failed to instance %s", clazz.getName()), e); + } + errorException.initCause(throwable); + throw errorException; + } + + + public static <T extends ErrorException> void dealErrorException(int errorCode, String errorDesc, + Class<T> clazz) throws T { + T errorException = null; + try { + Constructor<T> constructor = clazz.getConstructor(int.class, String.class); + errorException = constructor.newInstance(errorCode, errorDesc); + errorException.setErrCode(errorCode); + errorException.setDesc(errorDesc); + } catch (Exception e) { + throw new RuntimeException(String.format("failed to instance %s", clazz.getName()), e); + } + throw errorException; + } + +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/utils/SchedulisUtils.java b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/utils/SchedulisUtils.java new file mode 100644 index 000000000..3f2030319 --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/schedulis/utils/SchedulisUtils.java @@ -0,0 +1,49 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.utils; + +import com.google.gson.Gson; +import org.apache.commons.lang.StringUtils; + +import java.util.Map; + +public class SchedulisUtils { + + + public static String handleSchedulisResponse(String response){ + Gson gson = new Gson(); + Object object = gson.fromJson(response, Object.class); + String status = null; + String message = null; + if (object instanceof Map) { + Map map = (Map) object; + if (map.get("status") != null) { + status = map.get("status").toString(); + } + if (StringUtils.isNotEmpty(status)) { + if (null != map.get("message")) { + message = map.get("message").toString(); + } + } + if ("error".equalsIgnoreCase(status)) { + return message; + } + } + return "success"; + } + +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/resources/appconn.properties b/dss-appconn/appconns/dss-schedulis-appconn/src/main/resources/appconn.properties new file mode 100644 index 000000000..19365e2b5 --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/resources/appconn.properties @@ -0,0 +1,20 @@ +# +# Copyright 2019 WeBank +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# +# + + + + + diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/resources/init.sql b/dss-appconn/appconns/dss-schedulis-appconn/src/main/resources/init.sql new file mode 100644 index 000000000..7a3b8ba17 --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/resources/init.sql @@ -0,0 +1,19 @@ +select @dss_appconn_schedulisId:=id from `dss_appconn` where `appconn_name` = 'schedulis'; + +delete from `dss_appconn_instance` where `appconn_id`=@dss_appconn_schedulisId; +delete from `dss_appconn` where `appconn_name`='schedulis'; + +INSERT INTO `dss_appconn` (`appconn_name`, `is_user_need_init`, `level`, `if_iframe`, `is_external`, `reference`, `class_name`, `appconn_class_path`, `resource`) +VALUES ('schedulis', 0, 1, NULL, 0, NULL, 'com.webank.wedatasphere.dss.appconn.schedulis.SchedulisAppConn', 'DSS_INSTALL_HOME_VAL/dss-appconns/schedulis/lib', ''); + +select @dss_appconn_schedulisId:=id from `dss_appconn` where `appconn_name` = 'schedulis'; +insert into `dss_appconn_instance` (`appconn_id`, `label`, `url`, `enhance_json`, `homepage_url`, `redirect_url`) values(@dss_appconn_schedulisId,'DEV','http://APPCONN_INSTALL_IP:APPCONN_INSTALL_PORT/','','http://APPCONN_INSTALL_IP:APPCONN_INSTALL_PORT/','http://APPCONN_INSTALL_IP:APPCONN_INSTALL_PORT/'); + +delete from `dss_application` WHERE `name` ='schedulis'; +INSERT INTO `dss_application`(`name`,`url`,`is_user_need_init`,`level`,`user_init_url`,`exists_project_service`,`project_url`,`enhance_json`,`if_iframe`,`homepage_url`,`redirect_url`) +VALUES 
('schedulis','http://APPCONN_INSTALL_IP:APPCONN_INSTALL_PORT',0,1,NULL,0,'http://APPCONN_INSTALL_IP:APPCONN_INSTALL_PORT/manager?project=${projectName}','{\"scheduleHistory\":\"http://APPCONN_INSTALL_IP:APPCONN_INSTALL_PORT/manager?project=${projectName}&flow=${flowName}&hideHead=true#executions\"}',1,'http://APPCONN_INSTALL_IP:APPCONN_INSTALL_PORT/homepage','http://APPCONN_INSTALL_IP:APPCONN_INSTALL_PORT/api/v1/redirect'); + +select @dss_schedulis_applicationId:=id from `dss_application` WHERE `name` ='schedulis'; +delete from `dss_onestop_menu_application` WHERE title_en='Schedulis'; +INSERT INTO `dss_onestop_menu_application` (`application_id`, `onestop_menu_id`, `title_en`, `title_cn`, `desc_en`, `desc_cn`, `labels_en`, `labels_cn`, `is_active`, `access_button_en`, `access_button_cn`, `manual_button_en`, `manual_button_cn`, `manual_button_url`, `icon`, `order`, `create_by`, `create_time`, `last_update_time`, `last_update_user`, `image`) +VALUES(@dss_schedulis_applicationId,'3','Schedulis','Schedulis','Description for Schedulis.','Schedulis描述','scheduling, workflow','调度,工作流','1','enter Schedulis','进入Schedulis','user manual','用户手册','http://127.0.0.1:8088/wiki/scriptis/manual/workspace_cn.html','diaoduxitong-logo',NULL,NULL,NULL,NULL,NULL,'diaoduxitong-icon'); diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/resources/log4j.properties b/dss-appconn/appconns/dss-schedulis-appconn/src/main/resources/log4j.properties new file mode 100644 index 000000000..ee8619595 --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/resources/log4j.properties @@ -0,0 +1,36 @@ +# +# Copyright 2019 WeBank +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# +# + +### set log levels ### + +log4j.rootCategory=INFO,console + +log4j.appender.console=org.apache.log4j.ConsoleAppender +log4j.appender.console.Threshold=INFO +log4j.appender.console.layout=org.apache.log4j.PatternLayout +#log4j.appender.console.layout.ConversionPattern= %d{ISO8601} %-5p (%t) [%F:%M(%L)] - %m%n +log4j.appender.console.layout.ConversionPattern= %d{ISO8601} %-5p (%t) %p %c{1} - %m%n + + +log4j.appender.com.webank.bdp.ide.core=org.apache.log4j.DailyRollingFileAppender +log4j.appender.com.webank.bdp.ide.core.Threshold=INFO +log4j.additivity.com.webank.bdp.ide.core=false +log4j.appender.com.webank.bdp.ide.core.layout=org.apache.log4j.PatternLayout +log4j.appender.com.webank.bdp.ide.core.Append=true +log4j.appender.com.webank.bdp.ide.core.File=logs/linkis.log +log4j.appender.com.webank.bdp.ide.core.layout.ConversionPattern= %d{ISO8601} %-5p (%t) [%F:%M(%L)] - %m%n + +log4j.logger.org.springframework=INFO diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/resources/log4j2.xml b/dss-appconn/appconns/dss-schedulis-appconn/src/main/resources/log4j2.xml new file mode 100644 index 000000000..8c40a73e8 --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/resources/log4j2.xml @@ -0,0 +1,38 @@ + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/scala/com/webank/wedatasphere/dss/appconn/schedulis/conf/SchedulisConf.scala b/dss-appconn/appconns/dss-schedulis-appconn/src/main/scala/com/webank/wedatasphere/dss/appconn/schedulis/conf/SchedulisConf.scala new file 
mode 100644 index 000000000..d54d7edc4 --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/scala/com/webank/wedatasphere/dss/appconn/schedulis/conf/SchedulisConf.scala @@ -0,0 +1,30 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.conf + +import java.lang +import java.lang.reflect.Type + +import com.google.gson.{Gson, GsonBuilder, JsonElement, JsonPrimitive, JsonSerializationContext, JsonSerializer} + +object SchedulisConf { + implicit val gson:Gson = new GsonBuilder().setPrettyPrinting().setDateFormat("yyyy-MM-dd'T'HH:mm:ssZ").serializeNulls + .registerTypeAdapter(classOf[java.lang.Double], new JsonSerializer[java.lang.Double] { + override def serialize(t: lang.Double, `type`: Type, jsonSerializationContext: JsonSerializationContext): JsonElement = + if(t == t.longValue()) new JsonPrimitive(t.longValue()) else new JsonPrimitive(t) + }).create +} diff --git a/dss-appconn/appconns/dss-schedulis-appconn/src/main/scala/com/webank/wedatasphere/dss/appconn/schedulis/http/SchedulisHttpAction.scala b/dss-appconn/appconns/dss-schedulis-appconn/src/main/scala/com/webank/wedatasphere/dss/appconn/schedulis/http/SchedulisHttpAction.scala new file mode 100644 index 000000000..41dd239ff --- /dev/null +++ b/dss-appconn/appconns/dss-schedulis-appconn/src/main/scala/com/webank/wedatasphere/dss/appconn/schedulis/http/SchedulisHttpAction.scala @@ -0,0 +1,85 @@ +/* 
+ * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.schedulis.http + +import java.io.{File, InputStream} +import java.util + +import com.webank.wedatasphere.dss.appconn.schedulis.conf.SchedulisConf +import com.webank.wedatasphere.linkis.httpclient.request.{GetAction, HttpAction, POSTAction, UploadAction, UserAction} + +trait SchedulisHttpAction extends UserAction{ + + private var user:String = _ + + override def setUser(user: String): Unit = this.user = user + + override def getUser: String = this.user + +} + +abstract class SchedulisGetAction extends GetAction with SchedulisHttpAction + + +abstract class ScheudlisPostAction extends POSTAction with SchedulisHttpAction{ + + override def getRequestPayload: String = SchedulisConf.gson.toJson(getRequestPayloads) + +} + + + + +case class SchedulisUploadAction(filePaths:Array[String], + _inputStreams:util.Map[String,InputStream],uploadUrl:String) extends ScheudlisPostAction with UploadAction with SchedulisHttpAction{ + + private val streamNames = new util.HashMap[String,String] + + override val files: util.Map[String, String] = { + if (null == filePaths || filePaths.length == 0) new util.HashMap[String,String]() else{ + val map = new java.util.HashMap[String, String] + filePaths foreach { + filePath => val arr = filePath.split(File.separator) + val fileName = arr(arr.length - 1) + map.put("file", filePath) + } + map + } + } + + override def 
inputStreams: util.Map[String, InputStream] = _inputStreams + + override def inputStreamNames: util.Map[String, String] = streamNames + + private var _user:String = _ + + override def setUser(user: String): Unit = this._user = user + + override def getUser: String = this._user + + override def getRequestPayload: String = "" + + override def getURL: String = uploadUrl +} + +class SchedulisCreateProjectAction(url:String) extends ScheudlisPostAction{ + + override def getURL: String = url + +} + + diff --git a/dss-appconn/appconns/dss-sendemail-appconn/pom.xml b/dss-appconn/appconns/dss-sendemail-appconn/pom.xml new file mode 100644 index 000000000..88969336c --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/pom.xml @@ -0,0 +1,36 @@ + + + + + + dss + com.webank.wedatasphere.dss + 1.0.0 + ../../pom.xml + + 4.0.0 + + dss-sendemail-appconn + pom + + + sendemail-appconn-core + + + \ No newline at end of file diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/pom.xml b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/pom.xml new file mode 100644 index 000000000..29ca4f2d5 --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/pom.xml @@ -0,0 +1,153 @@ + + + + + + dss + com.webank.wedatasphere.dss + 1.0.0 + ../pom.xml + + 4.0.0 + + dss-sendemail-appconn-core + + + com.webank.wedatasphere.dss + dss-appconn-core + ${dss.version} + compile + + + + com.webank.wedatasphere.dss + dss-common + ${dss.version} + compile + + + + com.webank.wedatasphere.dss + dss-development-process-standard-execution + ${dss.version} + + + + com.webank.wedatasphere.linkis + linkis-storage + ${linkis.version} + provided + true + + + + com.webank.wedatasphere.linkis + linkis-module + ${linkis.version} + provided + true + + + + org.springframework + spring-context-support + 5.2.5.RELEASE + + + javax.mail + mail + 1.4 + + + + com.webank.wedatasphere.linkis + linkis-cs-client + ${linkis.version} + + + 
org.apache.httpcomponents + httpclient + 4.5.4 + compile + + + + + + + org.apache.maven.plugins + maven-deploy-plugin + + + net.alchim31.maven + scala-maven-plugin + + + org.apache.maven.plugins + maven-jar-plugin + + + org.apache.maven.plugins + maven-assembly-plugin + 2.3 + false + + + make-assembly + package + + single + + + + src/main/assembly/distribution.xml + + + + + + false + out + false + false + + src/main/assembly/distribution.xml + + + + + + + src/main/java + + **/*.xml + + + + src/main/resources + + **/*.properties + **/application.yml + **/bootstrap.yml + **/log4j2.xml + + + + + \ No newline at end of file diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/assembly/distribution.xml b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/assembly/distribution.xml new file mode 100644 index 000000000..60cac86d7 --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/assembly/distribution.xml @@ -0,0 +1,64 @@ + + + + dss-sendemail-appconn + + dir + + true + sendemail + + + + + + lib + true + true + false + true + true + + + + + + ${basedir}/src/main/resources + + appconn.properties + + 0777 + / + unix + + + + ${basedir}/src/main/resources + + * + + 0777 + conf + unix + + + + + diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/SendEmailAppConn.java b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/SendEmailAppConn.java new file mode 100644 index 000000000..6d13c8156 --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/SendEmailAppConn.java @@ -0,0 +1,61 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance 
with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail; + +import com.webank.wedatasphere.dss.appconn.core.ext.OnlyDevelopmentAppConn; +import com.webank.wedatasphere.dss.appconn.core.impl.AbstractAppConn; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefExecutionOperation; +import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefExecutionService; +import com.webank.wedatasphere.dss.standard.app.development.service.RefExecutionService; +import com.webank.wedatasphere.dss.standard.app.development.standard.DevelopmentIntegrationStandard; +import com.webank.wedatasphere.dss.standard.app.development.standard.OnlyExecutionDevelopmentStandard; + +public class SendEmailAppConn extends AbstractAppConn implements OnlyDevelopmentAppConn { + + private DevelopmentIntegrationStandard standard; + + @Override + protected void initialize() { + standard = new OnlyExecutionDevelopmentStandard() { + @Override + public void close() { + } + + @Override + protected RefExecutionService createRefExecutionService() { + return new AbstractRefExecutionService() { + private RefExecutionOperation refExecutionOperation = new SendEmailRefExecutionOperation(); + + @Override + public RefExecutionOperation createRefExecutionOperation() { + return refExecutionOperation; + } + }; + } + + @Override + public void init() { + + } + }; + } + + @Override + public DevelopmentIntegrationStandard getOrCreateDevelopmentStandard() { + return standard; + } +} diff --git 
a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/conf/SendEmailAppConnInstanceConfiguration.java b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/conf/SendEmailAppConnInstanceConfiguration.java new file mode 100644 index 000000000..e039ae37d --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/conf/SendEmailAppConnInstanceConfiguration.java @@ -0,0 +1,96 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.conf; + +import com.webank.wedatasphere.dss.appconn.sendemail.email.EmailGenerator; +import com.webank.wedatasphere.dss.appconn.sendemail.email.EmailSender; +import com.webank.wedatasphere.dss.appconn.sendemail.email.generate.MultiContentEmailGenerator; +import com.webank.wedatasphere.dss.appconn.sendemail.email.sender.SpringJavaEmailSender; +import com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.EmailContentGenerator; +import com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.EmailContentParser; +import com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.generator.MultiEmailContentGenerator; +import com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.parser.FileEmailContentParser$; +import com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.parser.HtmlEmailContentParser$; +import com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.parser.PictureEmailContentParser$; +import com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.parser.TableEmailContentParser$; +import com.webank.wedatasphere.dss.appconn.sendemail.hook.SendEmailRefExecutionHook; +import com.webank.wedatasphere.dss.common.utils.ClassUtils; +import java.util.List; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class SendEmailAppConnInstanceConfiguration { + + private static final Logger logger = LoggerFactory.getLogger(SendEmailAppConnInstanceConfiguration.class); + + private static final EmailGenerator EMAIL_GENERATOR = new MultiContentEmailGenerator(); + + private static final EmailSender EMAIL_SENDER = createEmailSender(); + + private static final EmailContentGenerator[] EMAIL_CONTENT_GENERATOR = createEmailContentGenerators(); + + private static final EmailContentParser[] emailContentParsers = createEmailContentParsers(); + + private static final SendEmailRefExecutionHook[] sendEmailRefExecutionHooks = createSendEmailRefExecutionHooks(); + + private static 
EmailSender createEmailSender() {
+        EmailSender emailSender = ClassUtils.getInstanceOrDefault(EmailSender.class, new SpringJavaEmailSender());
+        logger.info("Try to use {} to instantiate an EmailSender.", emailSender.getClass().getSimpleName());
+        return emailSender;
+    }
+
+    private static EmailContentGenerator[] createEmailContentGenerators() {
+        return new EmailContentGenerator[] {new MultiEmailContentGenerator()};
+    }
+
+    private static EmailContentParser[] createEmailContentParsers() {
+        return new EmailContentParser[] {FileEmailContentParser$.MODULE$,
+            HtmlEmailContentParser$.MODULE$, PictureEmailContentParser$.MODULE$, TableEmailContentParser$.MODULE$};
+    }
+
+    private static SendEmailRefExecutionHook[] createSendEmailRefExecutionHooks() {
+        List<SendEmailRefExecutionHook> hooks = ClassUtils.getInstances(SendEmailRefExecutionHook.class);
+        logger.info("SendEmailRefExecutionHook list is {}.", hooks);
+        return hooks.toArray(new SendEmailRefExecutionHook[0]);
+    }
+
+    public static EmailSender getEmailSender() {
+        return EMAIL_SENDER;
+    }
+
+    public static void init() {
+        logger.info("init SendEmailAppConnInstanceConfiguration");
+    }
+
+    public static EmailGenerator getEmailGenerator() {
+        return EMAIL_GENERATOR;
+    }
+
+    public static EmailContentGenerator[] getEmailContentGenerators() {
+        return EMAIL_CONTENT_GENERATOR;
+    }
+
+    public static EmailContentParser[] getEmailContentParsers() {
+        return emailContentParsers;
+    }
+
+    public static SendEmailRefExecutionHook[] getSendEmailRefExecutionHooks() {
+        return sendEmailRefExecutionHooks;
+    }
+
+}
diff --git a/sendemail-appjoint/sendemail-core/src/main/java/com/webank/wedatasphere/dss/appjoint/sendemail/Email.java b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/email/Email.java similarity index 82% rename from sendemail-appjoint/sendemail-core/src/main/java/com/webank/wedatasphere/dss/appjoint/sendemail/Email.java rename to
dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/email/Email.java index 61e1021e3..91ec8ed78 100644 --- a/sendemail-appjoint/sendemail-core/src/main/java/com/webank/wedatasphere/dss/appjoint/sendemail/Email.java +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/email/Email.java @@ -1,8 +1,7 @@ /* * Copyright 2019 WeBank - * * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. + * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 @@ -15,10 +14,10 @@ * */ -package com.webank.wedatasphere.dss.appjoint.sendemail; -/** - * Created by shanhuang on 2019/10/12. - */ +package com.webank.wedatasphere.dss.appconn.sendemail.email; + +import com.webank.wedatasphere.dss.appconn.sendemail.email.domain.Attachment; + public interface Email { String getContent(); diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/email/sender/AbstractEmailSender.java b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/email/sender/AbstractEmailSender.java new file mode 100644 index 000000000..53e7f7d4a --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/email/sender/AbstractEmailSender.java @@ -0,0 +1,36 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+package com.webank.wedatasphere.dss.appconn.sendemail.email.sender;
+
+import com.webank.wedatasphere.dss.appconn.sendemail.email.Email;
+import com.webank.wedatasphere.dss.appconn.sendemail.email.EmailSender;
+import com.webank.wedatasphere.linkis.common.utils.Utils;
+import scala.runtime.BoxedUnit;
+
+import java.util.concurrent.Future;
+
+public abstract class AbstractEmailSender implements EmailSender {
+
+    @Override
+    public Future<?> sendAsync(Email email) {
+        return Utils.defaultScheduler().submit(() -> {
+            send(email);
+            return BoxedUnit.UNIT;
+        });
+    }
+
+}
diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/email/sender/SpringJavaEmailSender.java b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/email/sender/SpringJavaEmailSender.java new file mode 100644 index 000000000..b36bb337d --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/email/sender/SpringJavaEmailSender.java @@ -0,0 +1,99 @@
+/*
+ * Copyright 2019 WeBank
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+package com.webank.wedatasphere.dss.appconn.sendemail.email.sender;
+
+import com.webank.wedatasphere.dss.appconn.sendemail.conf.SendEmailAppConnConfiguration;
+import com.webank.wedatasphere.dss.appconn.sendemail.email.Email;
+import com.webank.wedatasphere.dss.appconn.sendemail.email.domain.Attachment;
+import com.webank.wedatasphere.dss.appconn.sendemail.exception.EmailSendFailedException;
+
+import java.util.Properties;
+import javax.mail.internet.MimeMessage;
+import javax.mail.util.ByteArrayDataSource;
+
+import org.apache.commons.lang.StringUtils;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import org.springframework.mail.javamail.JavaMailSenderImpl;
+import org.springframework.mail.javamail.MimeMessageHelper;
+
+public class SpringJavaEmailSender extends AbstractEmailSender {
+
+    private static final Logger logger = LoggerFactory.getLogger(SpringJavaEmailSender.class);
+
+    private JavaMailSenderImpl javaMailSender = new JavaMailSenderImpl();
+
+    public SpringJavaEmailSender() {
+        try {
+            Properties prop = new Properties();
+            // JavaMail reads session properties via Properties.getProperty, which
+            // ignores non-String values, so these flags must be stored as Strings.
+            prop.put("mail.smtp.auth", String.valueOf(Boolean.parseBoolean(SendEmailAppConnConfiguration.EMAIL_SMTP_AUTH().getValue())));
+            prop.put("mail.smtp.starttls.enable", String.valueOf(Boolean.parseBoolean(SendEmailAppConnConfiguration.EMAIL_SMTP_STARTTLS_ENABLE().getValue())));
+            prop.put("mail.smtp.starttls.required", String.valueOf(Boolean.parseBoolean(SendEmailAppConnConfiguration.EMAIL_SMTP_STARTTLS_REQUIRED().getValue())));
+            prop.put("mail.smtp.ssl.enable", String.valueOf(Boolean.parseBoolean(SendEmailAppConnConfiguration.EMAIL_SMTP_SSL_ENABLED().getValue())));
+            prop.put("mail.smtp.timeout", String.valueOf(Integer.parseInt(SendEmailAppConnConfiguration.EMAIL_SMTP_TIMEOUT().getValue())));
+            javaMailSender.setJavaMailProperties(prop);
+        } catch (Exception e) {
+            logger.error("Failed to read mail properties, roll back to default values.", e);
+        }
+    }
+
+    @Override
+    public void send(Email email) throws EmailSendFailedException {
+        logger.info("Begin to send Email({}).", email.getSubject());
+        try {
+            javaMailSender.send(parseToMimeMessage(email));
+        } catch (Exception e) {
+            logger.error("Send email failed: ", e);
+            EmailSendFailedException ex = new EmailSendFailedException(80001, "Send email failed!");
+            ex.initCause(e);
+            throw ex;
+        }
+        logger.info("Send Email({}) succeed.", email.getSubject());
+    }
+
+    private MimeMessage parseToMimeMessage(Email email) {
+        MimeMessage message = javaMailSender.createMimeMessage();
+        try {
+            MimeMessageHelper messageHelper = new MimeMessageHelper(message, true);
+            if (StringUtils.isBlank(email.getFrom())) {
+                messageHelper.setFrom(SendEmailAppConnConfiguration.DEFAULT_EMAIL_FROM().getValue());
+            } else {
+                messageHelper.setFrom(email.getFrom());
+            }
+            messageHelper.setSubject(email.getSubject());
+            messageHelper.setTo(email.getTo());
+            if (StringUtils.isNotBlank(email.getCc())) {
+                messageHelper.setCc(email.getCc());
+            }
+            if (StringUtils.isNotBlank(email.getBcc())) {
+                messageHelper.setBcc(email.getBcc());
+            }
+            for (Attachment attachment : email.getAttachments()) {
+                messageHelper.addAttachment(attachment.getName(), new ByteArrayDataSource(attachment.getBase64Str(), attachment.getMediaType()));
+            }
+            messageHelper.setText(email.getContent(), true);
+        } catch (Exception e) {
+            logger.error("Build MimeMessage failed", e);
+        }
+        return message;
+    }
+
+    public JavaMailSenderImpl getJavaMailSender() {
+        return javaMailSender;
+    }
+}
diff --git
a/sendemail-appjoint/sendemail-core/src/main/java/com/webank/wedatasphere/dss/appjoint/sendemail/EmailContent.java b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/EmailContent.java similarity index 77% rename from sendemail-appjoint/sendemail-core/src/main/java/com/webank/wedatasphere/dss/appjoint/sendemail/EmailContent.java rename to dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/EmailContent.java index a024c85df..c69b0d5f0 100644 --- a/sendemail-appjoint/sendemail-core/src/main/java/com/webank/wedatasphere/dss/appjoint/sendemail/EmailContent.java +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/EmailContent.java @@ -1,8 +1,7 @@ /* * Copyright 2019 WeBank - * * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. + * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 @@ -15,11 +14,8 @@ * */ -package com.webank.wedatasphere.dss.appjoint.sendemail; +package com.webank.wedatasphere.dss.appconn.sendemail.emailcontent; -/** - * Created by shanhuang on 2019/10/12. 
- */
 public interface EmailContent<T> {
 
     T getContent();
diff --git a/sendemail-appjoint/sendemail-core/src/main/java/com/webank/wedatasphere/dss/appjoint/sendemail/exception/EmailSendFailedException.java b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/exception/EmailSendFailedException.java similarity index 80% rename from sendemail-appjoint/sendemail-core/src/main/java/com/webank/wedatasphere/dss/appjoint/sendemail/exception/EmailSendFailedException.java rename to dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/exception/EmailSendFailedException.java index 2f8b225a7..1f6aa8da7 100644 --- a/sendemail-appjoint/sendemail-core/src/main/java/com/webank/wedatasphere/dss/appjoint/sendemail/exception/EmailSendFailedException.java +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/exception/EmailSendFailedException.java @@ -1,8 +1,7 @@ /*
  * Copyright 2019 WeBank
- *
  * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
+ * you may not use this file except in compliance with the License.
  * You may obtain a copy of the License at
  *
  * http://www.apache.org/licenses/LICENSE-2.0
 @@ -15,13 +14,11 @@
  */
-package com.webank.wedatasphere.dss.appjoint.sendemail.exception;
+package com.webank.wedatasphere.dss.appconn.sendemail.exception;
 
 import com.webank.wedatasphere.linkis.common.exception.ErrorException;
-/**
- * Created by shanhuang on 2019/10/12.
- */ + public class EmailSendFailedException extends ErrorException { public EmailSendFailedException(int errCode, String desc) { super(errCode, desc); diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/hook/EmailInfo.java b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/hook/EmailInfo.java new file mode 100644 index 000000000..7060ac95d --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/hook/EmailInfo.java @@ -0,0 +1,139 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.hook; + +import java.util.Map; + +public class EmailInfo { + private String formId; + private String user; + private String dssProject; + private String widgets; + private String cc; + private String to; + private String bcc; + private String status; + private String priority; + private String alertList; + private int alertInterval = 60; + private String requestCreatedate; + private Map widgetColumns; + + public String getFormId() { + return formId; + } + + public void setFormId(String formId) { + this.formId = formId; + } + + public String getUser() { + return user; + } + + public void setUser(String user) { + this.user = user; + } + + public String getDssProject() { + return dssProject; + } + + public void setDssProject(String dssProject) { + this.dssProject = dssProject; + } + + public String getWidgets() { + return widgets; + } + + public void setWidgets(String widgets) { + this.widgets = widgets; + } + + public String getCc() { + return cc; + } + + public void setCc(String cc) { + this.cc = cc; + } + + public String getTo() { + return to; + } + + public void setTo(String to) { + this.to = to; + } + + public String getBcc() { + return bcc; + } + + public void setBcc(String bcc) { + this.bcc = bcc; + } + + public String getStatus() { + return status; + } + + public void setStatus(String status) { + this.status = status; + } + + public String getPriority() { + return priority; + } + + public void setPriority(String priority) { + this.priority = priority; + } + + public String getAlertList() { + return alertList; + } + + public void setAlertList(String alertList) { + this.alertList = alertList; + } + + public int getAlertInterval() { + return alertInterval; + } + + public void setAlertInterval(int alertInterval) { + this.alertInterval = alertInterval; + } + + public String getRequestCreatedate() { + return requestCreatedate; + } + + public void setRequestCreatedate(String requestCreatedate) { + 
this.requestCreatedate = requestCreatedate; + } + + public Map getWidgetColumns() { + return widgetColumns; + } + + public void setWidgetColumns(Map widgetColumns) { + this.widgetColumns = widgetColumns; + } +} diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/hook/HttpClientUtil.java b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/hook/HttpClientUtil.java new file mode 100644 index 000000000..c7dea2f3b --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/hook/HttpClientUtil.java @@ -0,0 +1,470 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.hook; + + +import org.apache.http.Consts; +import org.apache.http.HttpEntity; +import org.apache.http.HttpResponse; +import org.apache.http.NameValuePair; +import org.apache.http.client.ClientProtocolException; +import org.apache.http.client.config.RequestConfig; +import org.apache.http.client.entity.UrlEncodedFormEntity; +import org.apache.http.client.methods.CloseableHttpResponse; +import org.apache.http.client.methods.HttpGet; +import org.apache.http.client.methods.HttpPost; +import org.apache.http.entity.ContentType; +import org.apache.http.entity.StringEntity; +import org.apache.http.impl.client.CloseableHttpClient; +import org.apache.http.impl.client.HttpClients; +import org.apache.http.impl.conn.PoolingHttpClientConnectionManager; +import org.apache.http.message.BasicNameValuePair; +import org.apache.http.util.EntityUtils; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import javax.net.ssl.TrustManager; +import javax.net.ssl.X509TrustManager; +import java.io.IOException; +import java.io.UnsupportedEncodingException; +import java.net.SocketTimeoutException; +import java.net.URLEncoder; +import java.security.cert.CertificateException; +import java.text.SimpleDateFormat; +import java.util.*; + +@SuppressWarnings("all") +public final class HttpClientUtil { + private final static Logger logger = LoggerFactory.getLogger(HttpClientUtil.class); + public final static int connectTimeout = 5000; + private static PoolingHttpClientConnectionManager connManager = null; + private static CloseableHttpClient httpclient = null; + + private static TrustManager trustAllManager = new X509TrustManager() { + @Override + public void checkClientTrusted(java.security.cert.X509Certificate[] arg0, String arg1) + throws CertificateException { + } + @Override + public void checkServerTrusted(java.security.cert.X509Certificate[] arg0, String arg1) + throws CertificateException { + } + @Override + public 
java.security.cert.X509Certificate[] getAcceptedIssuers() {
+            // the X509TrustManager contract requires a non-null (possibly empty) array
+            return new java.security.cert.X509Certificate[0];
+        }
+
+    };
+
+    static {
+        httpclient = HttpClients.createDefault();
+    }
+
+    /**
+     * Posts a url-encoded form and returns the response body as a String.
+     */
+    public static String postForm(String url, int timeout, Map<String, Object> headerMap, List<NameValuePair> paramsList, String encoding) {
+        HttpPost post = new HttpPost(url);
+        try {
+            if (headerMap != null) {
+                for (Map.Entry<String, Object> entry : headerMap.entrySet()) {
+                    post.setHeader(entry.getKey(), entry.getValue().toString());
+                }
+            }
+            RequestConfig requestConfig = RequestConfig.custom()
+                .setSocketTimeout(timeout)
+                .setConnectTimeout(timeout)
+                .setConnectionRequestTimeout(timeout)
+                .setExpectContinueEnabled(false).build();
+            post.setConfig(requestConfig);
+
+            post.setEntity(new UrlEncodedFormEntity(paramsList, encoding));
+            CloseableHttpResponse response = httpclient.execute(post);
+            try {
+                HttpEntity entity = response.getEntity();
+                try {
+                    if (entity != null) {
+                        String str = EntityUtils.toString(entity, encoding);
+                        return str;
+                    }
+                } finally {
+                    if (entity != null) {
+                        entity.getContent().close();
+                    }
+                }
+            } finally {
+                if (response != null) {
+                    response.close();
+                }
+            }
+        } catch (Exception e) {
+            throw new RuntimeException("invoke http post error!", e);
+        } finally {
+            post.releaseConnection();
+        }
+        return "";
+    }
+
+    /**
+     * Used when calling the salt API.
+     *
+     * @author: XIEJIAN948@pingan.com.cn
+     */
+    public static String postJsonBody(String url, int timeout, Map<String, Object> headerMap,
+                                      String paraData, String encoding) {
+
+        logger.info("Start to post json body, url: {}.", url);
+        HttpPost post = new HttpPost(url);
+        try {
+            if (headerMap != null) {
+                for (Map.Entry<String, Object> entry : headerMap.entrySet()) {
+                    post.setHeader(entry.getKey(), entry.getValue().toString());
+                }
+            }
+            RequestConfig requestConfig = RequestConfig.custom().setSocketTimeout(timeout).setConnectTimeout(timeout)
.setConnectionRequestTimeout(timeout).setExpectContinueEnabled(false).build();
+            StringEntity jsonEntity = new StringEntity(paraData, ContentType.APPLICATION_JSON);
+            post.setConfig(requestConfig);
+            post.setEntity(jsonEntity);
+            CloseableHttpResponse response = httpclient.execute(post);
+            try {
+                HttpEntity entity = response.getEntity();
+                try {
+                    if (entity != null) {
+                        String str = EntityUtils.toString(entity, encoding);
+                        return str;
+                    }
+                } finally {
+                    if (entity != null) {
+                        entity.getContent().close();
+                    }
+                }
+            } finally {
+                if (response != null) {
+                    response.close();
+                }
+            }
+        } catch (UnsupportedEncodingException e) {
+            logger.error("UnsupportedEncodingException", e);
+            throw new RuntimeException("post json body failed!", e);
+        } catch (Exception e) {
+            logger.error("Exception", e);
+            throw new RuntimeException("post json body failed!", e);
+        } finally {
+            post.releaseConnection();
+        }
+        logger.info("Finished posting json body, url: {}.", url);
+        return "";
+    }
+
+    @SuppressWarnings("deprecation")
+    public static String invokeGet(String url, Map<String, String> params, String encode, int connectTimeout,
+                                   int soTimeout) {
+        String responseString = null;
+        RequestConfig requestConfig = RequestConfig.custom().setSocketTimeout(connectTimeout)
+            .setConnectTimeout(connectTimeout).setConnectionRequestTimeout(connectTimeout).build();
+
+        StringBuilder sb = new StringBuilder();
+        sb.append(url);
+        int i = 0;
+        if (params != null) {
+            for (Map.Entry<String, String> entry : params.entrySet()) {
+                if (i == 0 && !url.contains("?")) {
+                    sb.append("?");
+                } else {
+                    sb.append("&");
+                }
+                sb.append(entry.getKey());
+                sb.append("=");
+                String value = entry.getValue();
+                try {
+                    sb.append(URLEncoder.encode(value, "UTF-8"));
+                } catch (UnsupportedEncodingException e) {
+                    logger.warn("encode http get params error, value is " + value, e);
+                    sb.append(URLEncoder.encode(value));
+                }
+                i++;
+            }
+        }
+        HttpGet get = new HttpGet(sb.toString());
+        get.setConfig(requestConfig);
+        try {
CloseableHttpResponse response = httpclient.execute(get);
+            try {
+                HttpEntity entity = response.getEntity();
+                try {
+                    if (entity != null) {
+                        responseString = EntityUtils.toString(entity, encode);
+                    }
+                } finally {
+                    if (entity != null) {
+                        entity.getContent().close();
+                    }
+                }
+            } catch (Exception e) {
+                logger.error(String.format("[HttpUtils Get]get response error, url:%s", sb.toString()), e);
+                return responseString;
+            } finally {
+                if (response != null) {
+                    response.close();
+                }
+            }
+        } catch (SocketTimeoutException e) {
+            logger.error(String.format("[HttpUtils Get]invoke get timeout error, url:%s", sb.toString()), e);
+            return responseString;
+        } catch (Exception e) {
+            logger.error(String.format("[HttpUtils Get]invoke get error, url:%s", sb.toString()), e);
+        } finally {
+            get.releaseConnection();
+        }
+        return responseString;
+    }
+
+    /**
+     * HTTPS request with a default 5s timeout.
+     *
+     * @param reqURL
+     * @param params
+     * @return
+     */
+    public static String connectPostHttps(String reqURL, Map<String, String> params) {
+
+        String responseContent = null;
+        HttpPost httpPost = new HttpPost(reqURL);
+        try {
+            RequestConfig requestConfig = RequestConfig.custom().setSocketTimeout(connectTimeout)
+                .setConnectTimeout(connectTimeout).setConnectionRequestTimeout(connectTimeout).build();
+            // Bind the params to the entity before building it: UrlEncodedFormEntity
+            // formats its content at construction time, so filling the list afterwards
+            // (as the original code did) would send an empty body.
+            List<NameValuePair> formParams = new ArrayList<>();
+            for (Map.Entry<String, String> entry : params.entrySet()) {
+                formParams.add(new BasicNameValuePair(entry.getKey(), entry.getValue()));
+            }
+            httpPost.setEntity(new UrlEncodedFormEntity(formParams, Consts.UTF_8));
+            httpPost.setConfig(requestConfig);
+            CloseableHttpResponse response = httpclient.execute(httpPost);
+            try {
+                // execute the POST request and read the response entity
+                HttpEntity entity = response.getEntity();
+                try {
+                    if (null != entity) {
+                        responseContent = EntityUtils.toString(entity, Consts.UTF_8);
+                    }
+                } finally {
+                    if (entity != null) {
+                        entity.getContent().close();
+                    }
+                }
+            } finally {
+                if (response != null) {
+                    response.close();
+                }
+            }
+            logger.info("requestURI : " + httpPost.getURI() + ", responseContent: " + responseContent);
+        } catch (ClientProtocolException e) {
+            logger.error("ClientProtocolException", e);
+        } catch (IOException e) {
+            logger.error("IOException", e);
+        } finally {
+            httpPost.releaseConnection();
+        }
+        return responseContent;
+    }
+
+    class Test {
+        String v;
+        String k;
+
+        public String getV() {
+            return v;
+        }
+
+        public void setV(String v) {
+            this.v = v;
+        }
+
+        public String getK() {
+            return k;
+        }
+
+        public void setK(String k) {
+            this.k = k;
+        }
+
+    }
+
+    // returns a random 4-digit numeric string
+    public static String getRandomValue() {
+        String str = "0123456789";
+        Random random = new Random();
+        StringBuilder sb = new StringBuilder(4);
+        for (int i = 0; i < 4; i++) {
+            char ch = str.charAt(random.nextInt(str.length()));
+            sb.append(ch);
+        }
+        return sb.toString();
+    }
+
+    // current time as a unix timestamp in seconds
+    public static String getTimestamp() {
+
+        Date date = new Date();
+        String timestamp = String.valueOf(date.getTime() / 1000);
+        return timestamp;
+    }
+
+    // current time formatted to the second (yyyyMMddHHmmss)
+    public static String getNowDate() {
+        Date date = new Date();
+        SimpleDateFormat sdf = new SimpleDateFormat("yyyyMMddHHmmss");
+        return sdf.format(date);
+    }
+
+    /**
+     * Used when calling the salt API.
+     *
+     * @author: XIEJIAN948@pingan.com.cn
+     */
+    public static String postJsonBody2(String url, int timeout, Map<String, Object> headerMap,
+                                       List<NameValuePair> paramsList, String encoding) {
+        logger.info("Start to post json body, url: {}.", url);
+        HttpPost post = new HttpPost(url);
+        try {
+            if (headerMap != null) {
+                for (Map.Entry<String, Object> entry : headerMap.entrySet()) {
+                    post.setHeader(entry.getKey(), entry.getValue().toString());
+                }
+            }
+            RequestConfig requestConfig = RequestConfig.custom().setSocketTimeout(timeout).setConnectTimeout(timeout)
+                .setConnectionRequestTimeout(timeout).setExpectContinueEnabled(false).build();
+            post.setConfig(requestConfig);
+            if (paramsList.size() > 0) {
+                UrlEncodedFormEntity entity = new
UrlEncodedFormEntity(paramsList, encoding);
+                post.setEntity(entity);
+            }
+            CloseableHttpResponse response = httpclient.execute(post);
+            try {
+                HttpEntity entity = response.getEntity();
+                try {
+                    if (entity != null) {
+                        String str = EntityUtils.toString(entity, encoding);
+                        return str;
+                    }
+                } finally {
+                    if (entity != null) {
+                        entity.getContent().close();
+                    }
+                }
+            } finally {
+                if (response != null) {
+                    response.close();
+                }
+            }
+        } catch (UnsupportedEncodingException e) {
+            logger.error("UnsupportedEncodingException", e);
+            throw new RuntimeException("post json body failed!", e);
+        } catch (Exception e) {
+            logger.error("Exception", e);
+            throw new RuntimeException("post json body failed!", e);
+        } finally {
+            post.releaseConnection();
+        }
+        logger.info("Finished posting json body, url: {}.", url);
+        return "";
+    }
+
+    /**
+     * Used when calling the salt API.
+     *
+     * @author: XIEJIAN948@pingan.com.cn
+     */
+    public static String postJsonBody3(String url, int timeout, Map<String, Object> headerMap,
+                                       Map<String, String> paramsList, String encoding) {
+        HttpPost post = new HttpPost(url);
+        try {
+            if (headerMap != null) {
+                for (Map.Entry<String, Object> entry : headerMap.entrySet()) {
+                    post.setHeader(entry.getKey(), entry.getValue().toString());
+                }
+            }
+            RequestConfig requestConfig = RequestConfig.custom().setSocketTimeout(timeout).setConnectTimeout(timeout)
+                .setConnectionRequestTimeout(timeout).setExpectContinueEnabled(false).build();
+            post.setConfig(requestConfig);
+            if (!paramsList.isEmpty()) {
+                // NOTE: the previous implementation passed a null source string to
+                // StringEntity, which always throws; send the params map instead.
+                post.setEntity(new StringEntity(paramsList.toString(), encoding));
+                logger.info("Start to post json body, url: {}, params: {}.", url, paramsList);
+            }
+            CloseableHttpResponse response = httpclient.execute(post);
+            try {
+                HttpEntity entity = response.getEntity();
+                try {
+                    if (entity != null) {
+                        String
str = EntityUtils.toString(entity, encoding);
+                        return str;
+                    }
+                } finally {
+                    if (entity != null) {
+                        entity.getContent().close();
+                    }
+                }
+            } finally {
+                if (response != null) {
+                    response.close();
+                }
+            }
+        } catch (UnsupportedEncodingException e) {
+            logger.error("UnsupportedEncodingException", e);
+            throw new RuntimeException("post json body failed!", e);
+        } catch (Exception e) {
+            logger.error("Exception", e);
+            throw new RuntimeException("post json body failed!", e);
+        } finally {
+            post.releaseConnection();
+        }
+        logger.info("Finished posting json body, url: {}.", url);
+        return "";
+    }
+
+    public static String executeGet(String url) {
+        String rtnStr = "";
+        HttpGet httpGet = new HttpGet(url);
+        try {
+            HttpResponse httpResponse = httpclient.execute(httpGet);
+            // read the response body
+            rtnStr = EntityUtils.toString(httpResponse.getEntity());
+        } catch (IOException e) {
+            logger.error("execute http get failed, url: " + url, e);
+        } finally {
+            httpGet.releaseConnection();
+        }
+        return rtnStr;
+    }
+
+}
diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/hook/HttpResponseModel.java b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/hook/HttpResponseModel.java new file mode 100644 index 000000000..a6965b114 --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/hook/HttpResponseModel.java @@ -0,0 +1,47 @@
+/*
+ * Copyright 2019 WeBank
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.hook; + +public abstract class HttpResponseModel { + private String method; + private int status; + private String message; + + public String getMethod() { + return method; + } + + public void setMethod(String method) { + this.method = method; + } + + public int getStatus() { + return status; + } + + public void setStatus(int status) { + this.status = status; + } + + public String getMessage() { + return message; + } + + public void setMessage(String message) { + this.message = message; + } +} diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/hook/WidgetMetaData.java b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/hook/WidgetMetaData.java new file mode 100644 index 000000000..1e268538f --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/sendemail/hook/WidgetMetaData.java @@ -0,0 +1,85 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.hook; + +import java.util.List; +import java.util.Map; + +public class WidgetMetaData extends HttpResponseModel{ + public static class Meta{ + private String name; + private String updated; + private String columns; + + public String getName() { + return name; + } + + public void setName(String name) { + this.name = name; + } + + public String getUpdated() { + return updated; + } + + public void setUpdated(String updated) { + this.updated = updated; + } + + public String getColumns() { + return columns; + } + + public void setColumns(String columns) { + this.columns = columns; + } + } + + + public static class Data{ + private String projectName; + private List widgetsMetaData; + + public String getProjectName() { + return projectName; + } + + public void setProjectName(String projectName) { + this.projectName = projectName; + } + + public List getWidgetsMetaData() { + return widgetsMetaData; + } + + public void setWidgetsMetaData(List widgetsMetaData) { + this.widgetsMetaData = widgetsMetaData; + } + } + + private Data data; + + public Data getData() { + return data; + } + + public void setData(Data data) { + this.data = data; + } +} + diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/SendEmailRefExecutionOperation.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/SendEmailRefExecutionOperation.scala new file 
mode 100644 index 000000000..29ab4a9b2 --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/SendEmailRefExecutionOperation.scala @@ -0,0 +1,104 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail + +import java.util.Properties + +import com.webank.wedatasphere.dss.appconn.sendemail.conf.SendEmailAppConnInstanceConfiguration +import com.webank.wedatasphere.dss.appconn.sendemail.email.EmailSender +import com.webank.wedatasphere.dss.appconn.sendemail.email.sender.SpringJavaEmailSender +import com.webank.wedatasphere.dss.common.utils.ClassUtils +import com.webank.wedatasphere.dss.standard.app.development.listener.common.CompletedExecutionResponseRef +import com.webank.wedatasphere.dss.standard.app.development.operation.RefExecutionOperation +import com.webank.wedatasphere.dss.standard.app.development.ref.ExecutionRequestRef +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService +import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef +import com.webank.wedatasphere.linkis.common.exception.ErrorException +import com.webank.wedatasphere.linkis.common.utils.{Logging, Utils} +import org.springframework.mail.javamail.JavaMailSender + +import scala.collection.JavaConversions._ + +class SendEmailRefExecutionOperation extends 
RefExecutionOperation with Logging { + val EMAIL_FROM_DEFAULT = "email.from.default" + val EMAIL_HOST = "email.host" + val EMAIL_USERNAME = "email.username" + val EMAIL_PASSWORD = "email.password" + val EMAIL_PORT = "email.port" + val EMAIL_PROTOCOL = "email.protocol" + + private var service:DevelopmentService = _ + + private val sendEmailAppConnHooks = SendEmailAppConnInstanceConfiguration.getSendEmailRefExecutionHooks + private val emailContentParsers = SendEmailAppConnInstanceConfiguration.getEmailContentParsers + private val emailContentGenerators = SendEmailAppConnInstanceConfiguration.getEmailContentGenerators + private val emailGenerator = SendEmailAppConnInstanceConfiguration.getEmailGenerator + private val emailSender = SendEmailAppConnInstanceConfiguration.getEmailSender + + override def execute(requestRef: ExecutionRequestRef): ResponseRef = { + val instanceConfig = this.service.getAppInstance.getConfig + val properties = new Properties() + instanceConfig.foreach { + case (key: String, value: Object) => + properties.put(key, value.toString) + } + val springJavaEmailSender = new SpringJavaEmailSender() + val javaMailSender = springJavaEmailSender.getJavaMailSender + javaMailSender.setHost(properties.getProperty(EMAIL_HOST)) + javaMailSender.setPort(Integer.parseInt(properties.getProperty(EMAIL_PORT))) + javaMailSender.setUsername(properties.getProperty(EMAIL_USERNAME)) + javaMailSender.setPassword(properties.getProperty(EMAIL_PASSWORD)) + javaMailSender.setProtocol(properties.getProperty(EMAIL_PROTOCOL)) + val emailSender = ClassUtils.getInstanceOrDefault(classOf[EmailSender],springJavaEmailSender) + val response = new CompletedExecutionResponseRef(200) + val email = Utils.tryCatch { + sendEmailAppConnHooks.foreach(_.preGenerate(requestRef)) + val email = emailGenerator.generateEmail(requestRef) + emailContentParsers.foreach{ + p => Utils.tryQuietly(p.parse(email)) + } + emailContentGenerators.foreach{ + g => Utils.tryQuietly(g.generate(email)) + } + 
sendEmailAppConnHooks.foreach(_.preSend(requestRef, email)) + email + }{ t => + putErrorMsg("Failed to parse email content!", t, response) + return response + } + Utils.tryCatch { + emailSender.send(email) + response.setIsSucceed(true) + }(putErrorMsg("Failed to send email!", _, response)) + response + } + + protected def putErrorMsg(errorMsg: String, t: Throwable, + response: CompletedExecutionResponseRef): Unit = t match { + case t: Throwable => + response.setErrorMsg(errorMsg) + val exception = new ErrorException(80079, "failed to sendEmail") + exception.initCause(t) + logger.error(s"failed to send email, $errorMsg ", t) + response.setException(exception) + response.setIsSucceed(false) + } + + override def setDevelopmentService(service: DevelopmentService): Unit = { + this.service = service + } +} diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/conf/SendEmailAppConnConfiguration.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/conf/SendEmailAppConnConfiguration.scala new file mode 100644 index 000000000..ecabe6235 --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/conf/SendEmailAppConnConfiguration.scala @@ -0,0 +1,41 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.conf + +import com.webank.wedatasphere.linkis.common.conf.CommonVars + +object SendEmailAppConnConfiguration { + + val EMAIL_IMAGE_HEIGHT = CommonVars("wds.dss.appconn.email.image.height", 500) + val EMAIL_IMAGE_WIDTH = CommonVars("wds.dss.appconn.email.image.width", 1920) + val DEFAULT_EMAIL_FROM = CommonVars("wds.dss.appconn.email.from.default", "") + val DEFAULT_EMAIL_SUFFIX = CommonVars("wds.dss.appconn.email.suffix.default", "@webank.com") + + val DEV_CHECK = CommonVars("wds.dss.appconn.email.dev.check", true) + val EMAIL_HOST = CommonVars("wds.dss.appconn.email.host", "") + val EMAIL_PORT = CommonVars("wds.dss.appconn.email.port", "") + val EMAIL_PROTOCOL = CommonVars("wds.dss.appconn.email.protocol", "smtp") + val EMAIL_USERNAME = CommonVars("wds.dss.appconn.email.username", "") + val EMAIL_PASSWORD = CommonVars("wds.dss.appconn.email.password", "") + + val EMAIL_SMTP_AUTH = CommonVars("wds.dss.appconn.email.smtp.auth", "true") + val EMAIL_SMTP_STARTTLS_ENABLE = CommonVars("wds.dss.appconn.email.smtp.starttls.enable", "true") + val EMAIL_SMTP_STARTTLS_REQUIRED = CommonVars("wds.dss.appconn.email.smtp.starttls.required", "true") + val EMAIL_SMTP_SSL_ENABLED = CommonVars("wds.dss.appconn.email.smtp.ssl.enable", "true") + val EMAIL_SMTP_TIMEOUT = CommonVars("wds.dss.appconn.email.smtp.timeout", "25000") + +} diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/cs/EmailCSHelper.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/cs/EmailCSHelper.scala new file mode 100644 index 000000000..6e882eb53 --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/cs/EmailCSHelper.scala @@ -0,0 +1,60 @@ +/* + * Copyright 2019 WeBank + * Licensed under 
the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.cs + +import com.webank.wedatasphere.dss.appconn.sendemail.exception.EmailSendFailedException +import com.webank.wedatasphere.dss.standard.app.development.listener.core.ExecutionRequestRefContext +import com.webank.wedatasphere.linkis.common.utils.Logging +import com.webank.wedatasphere.linkis.cs.client.service.LinkisJobDataServiceImpl +import com.webank.wedatasphere.linkis.cs.client.utils.{ContextServiceUtils, SerializeHelper} +import com.webank.wedatasphere.linkis.cs.common.entity.enumeration.{ContextScope, ContextType} +import com.webank.wedatasphere.linkis.cs.common.entity.source.CommonContextKey +import com.webank.wedatasphere.linkis.cs.common.utils.CSCommonUtils +import com.webank.wedatasphere.linkis.server.JSONUtils + +import scala.collection.JavaConversions._ + + +object EmailCSHelper extends Logging{ + + /** + * update by peaceWong form cs to get job ID + */ + def getJobIds(refContext: ExecutionRequestRefContext): Array[Long] = { + val contextIDStr = ContextServiceUtils.getContextIDStrByMap(refContext.getRuntimeMap) + val nodeIDs = refContext.getRuntimeMap.get("content") match { + case string: String => JSONUtils.gson.fromJson(string, classOf[java.util.List[String]]) + case list: java.util.List[String] => list + } + if (null == nodeIDs || nodeIDs.length < 1){ + throw new EmailSendFailedException(80003 ,"empty result set is not allowed") + } + info(s"From cs 
to getJob ids $nodeIDs.") + val jobIds = nodeIDs.map(ContextServiceUtils.getNodeNameByNodeID(contextIDStr, _)).map{ nodeName => + val contextKey = new CommonContextKey + contextKey.setContextScope(ContextScope.PUBLIC) + contextKey.setContextType(ContextType.DATA) + contextKey.setKey(CSCommonUtils.NODE_PREFIX + nodeName + CSCommonUtils.JOB_ID) + LinkisJobDataServiceImpl.getInstance().getLinkisJobData(contextIDStr, SerializeHelper.serializeContextKey(contextKey)) + }.map(_.getJobID).toArray + if (null == jobIds || jobIds.length < 1){ + throw new EmailSendFailedException(80003 ,"empty result set is not allowed") + } + info(s"Job IDs is ${jobIds.toList}.") + jobIds + } +} diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/EmailGenerator.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/EmailGenerator.scala new file mode 100644 index 000000000..37fccbc9d --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/EmailGenerator.scala @@ -0,0 +1,25 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.email + +import com.webank.wedatasphere.dss.standard.app.development.ref.ExecutionRequestRef + +trait EmailGenerator { + + def generateEmail(requestRef: ExecutionRequestRef): Email + +} \ No newline at end of file diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/EmailSender.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/EmailSender.scala new file mode 100644 index 000000000..053076442 --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/EmailSender.scala @@ -0,0 +1,30 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.email + +import java.util.concurrent.Future + +import com.webank.wedatasphere.dss.appconn.sendemail.exception.EmailSendFailedException + +trait EmailSender { + + @throws(classOf[EmailSendFailedException]) + def send(email: Email): Unit + + def sendAsync(email: Email): Future[Unit] + +} diff --git a/sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/email/AbstractEmail.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/domain/AbstractEmail.scala similarity index 83% rename from sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/email/AbstractEmail.scala rename to dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/domain/AbstractEmail.scala index 277100807..0f5b9f14e 100644 --- a/sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/email/AbstractEmail.scala +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/domain/AbstractEmail.scala @@ -1,8 +1,7 @@ /* * Copyright 2019 WeBank - * * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. + * you may not use this file except in compliance with the License. 
* You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 @@ -15,16 +14,12 @@ * */ -package com.webank.wedatasphere.dss.appjoint.sendemail.email +package com.webank.wedatasphere.dss.appconn.sendemail.email.domain -import com.webank.wedatasphere.dss.appjoint.sendemail.{Attachment, Email} -import com.webank.wedatasphere.dss.appjoint.sendemail.{Attachment, Email} +import com.webank.wedatasphere.dss.appconn.sendemail.email.Email import scala.collection.mutable.ArrayBuffer -/** - * Created by shanhuang on 2019/10/12. - */ class AbstractEmail extends Email { private var content: String = _ diff --git a/sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/Attachment.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/domain/Attachment.scala similarity index 78% rename from sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/Attachment.scala rename to dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/domain/Attachment.scala index 5b9c16e6b..dbea0da83 100644 --- a/sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/Attachment.scala +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/domain/Attachment.scala @@ -1,8 +1,7 @@ /* * Copyright 2019 WeBank - * * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. + * you may not use this file except in compliance with the License. 
* You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 @@ -15,18 +14,13 @@ * */ -package com.webank.wedatasphere.dss.appjoint.sendemail +package com.webank.wedatasphere.dss.appconn.sendemail.email.domain import java.io.File -/** - * Created by shanhuang on 2019/10/12. - */ trait Attachment { - def getName: String def getBase64Str: String def getFile: File def getMediaType: String - } diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/domain/MultiContentEmail.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/domain/MultiContentEmail.scala new file mode 100644 index 000000000..957de390a --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/domain/MultiContentEmail.scala @@ -0,0 +1,33 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.email.domain + +import java.util + +import com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.EmailContent + +import scala.collection.JavaConversions._ + +class MultiContentEmail extends AbstractEmail { + + private val emailContents = new util.ArrayList[EmailContent[_]]() + + def addEmailContent(emailContent: EmailContent[_]): Unit = emailContents.add(emailContent) + + def getEmailContents: Array[EmailContent[_]] = emailContents.toIterator.toArray + +} diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/domain/PngAttachment.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/domain/PngAttachment.scala new file mode 100644 index 000000000..369a55cb0 --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/domain/PngAttachment.scala @@ -0,0 +1,31 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.email.domain + +import java.io.File + +class PngAttachment(name: String, b64: String) extends Attachment { + + override def getName: String = name + + override def getBase64Str: String = b64 + + override def getFile: File = null //TODO write b64 to file + + override def getMediaType: String = "image/png" + +} \ No newline at end of file diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/generate/AbstractEmailGenerator.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/generate/AbstractEmailGenerator.scala new file mode 100644 index 000000000..5d1c6b990 --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/generate/AbstractEmailGenerator.scala @@ -0,0 +1,73 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.email.generate + +import com.webank.wedatasphere.dss.appconn.sendemail.email.{Email, EmailGenerator} +import com.webank.wedatasphere.dss.appconn.sendemail.email.domain.AbstractEmail +import com.webank.wedatasphere.dss.appconn.sendemail.exception.EmailSendFailedException +import com.webank.wedatasphere.dss.standard.app.development.listener.common.AsyncExecutionRequestRef +import com.webank.wedatasphere.dss.standard.app.development.listener.core.ExecutionRequestRefContext +import com.webank.wedatasphere.dss.standard.app.development.ref.ExecutionRequestRef +import com.webank.wedatasphere.linkis.common.utils.Logging + +trait AbstractEmailGenerator extends EmailGenerator with Logging{ + + protected def createEmail(): AbstractEmail + + override def generateEmail(requestRef: ExecutionRequestRef): Email = { + val email = createEmail() + generateEmailInfo(requestRef, email) + generateEmailContent(requestRef, email) + email + } + + protected def getRuntimeMap(requestRef: ExecutionRequestRef): java.util.Map[String, AnyRef] = + requestRef match { + case r: AsyncExecutionRequestRef => r.getExecutionRequestRefContext.getRuntimeMap + case _ => requestRef.getParameters + } + + protected def getExecutionRequestRefContext(requestRef: ExecutionRequestRef): ExecutionRequestRefContext = + requestRef match { + case r: AsyncExecutionRequestRef => r.getExecutionRequestRefContext + case _ => throw new EmailSendFailedException(80002, "ExecutionRequestRefContext is empty!") + } + + protected def generateEmailInfo(requestRef: ExecutionRequestRef, email: AbstractEmail): Unit = { + import scala.collection.JavaConversions._ + val runtimeMap = getRuntimeMap(requestRef) + runtimeMap foreach { + case (k, v) => logger.info(s"K is $k, V is $v") + } + val subject = if (runtimeMap.get("subject") != null) runtimeMap.get("subject").toString else "This is an email" + email.setSubject(subject) + val bcc = if (runtimeMap.get("bcc") != null) 
runtimeMap.get("bcc").toString else "" + email.setBcc(bcc) + val cc = if (runtimeMap.get("cc") != null) runtimeMap.get("cc").toString else "" + email.setCc(cc) + val from = if (runtimeMap.get("from") != null) runtimeMap.get("from").toString else + if(runtimeMap.get("wds.dss.workflow.submit.user") != null){ + runtimeMap.get("wds.dss.workflow.submit.user").toString + } else runtimeMap.get("user").toString + email.setFrom(from) + val to = if (runtimeMap.get("to") != null) runtimeMap.get("to").toString else "" + email.setTo(to) + } + + protected def generateEmailContent(requestRef: ExecutionRequestRef, email: AbstractEmail): Unit + +} diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/generate/MultiContentEmailGenerator.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/generate/MultiContentEmailGenerator.scala new file mode 100644 index 000000000..4d79dd015 --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/email/generate/MultiContentEmailGenerator.scala @@ -0,0 +1,58 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.email.generate + +import com.webank.wedatasphere.dss.appconn.sendemail.cs.EmailCSHelper +import com.webank.wedatasphere.dss.appconn.sendemail.email.domain.{AbstractEmail, MultiContentEmail} +import com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.domain.PictureEmailContent +import com.webank.wedatasphere.dss.appconn.sendemail.exception.EmailSendFailedException +import com.webank.wedatasphere.dss.standard.app.development.ref.ExecutionRequestRef +import com.webank.wedatasphere.linkis.storage.resultset.ResultSetFactory + +class MultiContentEmailGenerator extends AbstractEmailGenerator { + + override protected def createEmail(): AbstractEmail = new MultiContentEmail + + override protected def generateEmailContent(requestRef: ExecutionRequestRef, email: AbstractEmail): Unit = email match { + case multiContentEmail: MultiContentEmail => + val runtimeMap = getRuntimeMap(requestRef) + val refContext = getExecutionRequestRefContext(requestRef) + runtimeMap.get("category") match { + case "node" => + val resultSetFactory = ResultSetFactory.getInstance + EmailCSHelper.getJobIds(refContext).foreach { jobId => + refContext.fetchLinkisJobResultSetPaths(jobId).foreach { fsPath => + val resultSet = resultSetFactory.getResultSetByPath(fsPath) + val emailContent = resultSet.resultSetType() match { + case ResultSetFactory.PICTURE_TYPE => new PictureEmailContent(fsPath) + case ResultSetFactory.HTML_TYPE => throw new EmailSendFailedException(80003 ,"html result set is not allowed")//new HtmlEmailContent(fsPath) + case ResultSetFactory.TABLE_TYPE => throw new EmailSendFailedException(80003 ,"table result set is not allowed")//new TableEmailContent(fsPath) + case ResultSetFactory.TEXT_TYPE => throw new EmailSendFailedException(80003 ,"text result set is not allowed")//new FileEmailContent(fsPath) + } + multiContentEmail.addEmailContent(emailContent) + } + } + case "file" => throw new EmailSendFailedException(80003 
,"file content is not allowed") //addContentEmail(c => new FileEmailContent(new FsPath(c))) + case "text" => throw new EmailSendFailedException(80003 ,"text content is not allowed")//addContentEmail(new TextEmailContent(_)) + case "link" => throw new EmailSendFailedException(80003 ,"link content is not allowed")//addContentEmail(new UrlEmailContent(_)) + } + } + + + + +} \ No newline at end of file diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/EmailContentGenerator.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/EmailContentGenerator.scala new file mode 100644 index 000000000..a6235abd2 --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/EmailContentGenerator.scala @@ -0,0 +1,25 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.emailcontent + +import com.webank.wedatasphere.dss.appconn.sendemail.email.Email + +trait EmailContentGenerator { + + def generate(email: Email): Unit + +} \ No newline at end of file diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/EmailContentParser.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/EmailContentParser.scala new file mode 100644 index 000000000..a85d52269 --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/EmailContentParser.scala @@ -0,0 +1,25 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.emailcontent + +import com.webank.wedatasphere.dss.appconn.sendemail.email.Email + +trait EmailContentParser { + + def parse(email: Email): Unit + +} diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/domain/ArrayEmailContent.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/domain/ArrayEmailContent.scala new file mode 100644 index 000000000..2fc552dd8 --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/domain/ArrayEmailContent.scala @@ -0,0 +1,29 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.domain + +import com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.EmailContent + +class ArrayEmailContent extends EmailContent[Array[String]] { + + private var content: Array[String] = _ + + override def getContent: Array[String] = content + + override def setContent(content: Array[String]): Unit = this.content = content + +} \ No newline at end of file diff --git a/sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/emailcontent/FsPathStoreEmailContent.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/domain/FsPathStoreEmailContent.scala similarity index 79% rename from sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/emailcontent/FsPathStoreEmailContent.scala rename to dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/domain/FsPathStoreEmailContent.scala index 059912155..c392ec3fb 100644 --- a/sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/emailcontent/FsPathStoreEmailContent.scala +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/domain/FsPathStoreEmailContent.scala @@ -1,8 +1,7 @@ /* * Copyright 2019 WeBank - * * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. + * you may not use this file except in compliance with the License. 
* You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 @@ -15,13 +14,10 @@ * */ -package com.webank.wedatasphere.dss.appjoint.sendemail.emailcontent +package com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.domain import com.webank.wedatasphere.linkis.common.io.FsPath -/** - * Created by enjoyyin on 2019/10/13. - */ trait FsPathStoreEmailContent { private var fsPath: FsPath = _ diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/domain/StringEmailContent.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/domain/StringEmailContent.scala new file mode 100644 index 000000000..e28ab0bc8 --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/domain/StringEmailContent.scala @@ -0,0 +1,29 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.domain + +import com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.EmailContent + +class StringEmailContent extends EmailContent[String] { + + private var content: String = _ + + override def getContent: String = content + + override def setContent(content: String): Unit = this.content = content + +} \ No newline at end of file diff --git a/sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/emailcontent/package.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/domain/package.scala similarity index 85% rename from sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/emailcontent/package.scala rename to dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/domain/package.scala index 525a1409e..91feb6317 100644 --- a/sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/emailcontent/package.scala +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/domain/package.scala @@ -1,8 +1,7 @@ /* * Copyright 2019 WeBank - * * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. + * you may not use this file except in compliance with the License. 
* You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 @@ -15,16 +14,12 @@ * */ -package com.webank.wedatasphere.dss.appjoint.sendemail.emailcontent +package com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.domain -import com.webank.wedatasphere.dss.appjoint.sendemail.emailcontent.StringEmailContent import com.webank.wedatasphere.linkis.common.io.FsPath import scala.beans.BeanProperty -/** - * Created by enjoyyin on 2019/10/13. - */ package object emailcontent { } diff --git a/sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/emailcontent/generator/AbstractEmailContentGenerator.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/generator/AbstractEmailContentGenerator.scala similarity index 75% rename from sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/emailcontent/generator/AbstractEmailContentGenerator.scala rename to dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/generator/AbstractEmailContentGenerator.scala index 53cf5a0fb..388a36d1c 100644 --- a/sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/emailcontent/generator/AbstractEmailContentGenerator.scala +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/generator/AbstractEmailContentGenerator.scala @@ -1,8 +1,7 @@ /* * Copyright 2019 WeBank - * * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. + * you may not use this file except in compliance with the License. 
* You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 @@ -15,23 +14,17 @@ * */ -package com.webank.wedatasphere.dss.appjoint.sendemail.emailcontent.generator +package com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.generator import java.text.SimpleDateFormat import java.util.{Calendar, Date} -import com.webank.wedatasphere.dss.appjoint.sendemail.{Email, EmailContentGenerator} -import com.webank.wedatasphere.dss.appjoint.sendemail.{Email, EmailContentGenerator} +import com.webank.wedatasphere.dss.appconn.sendemail.email.Email +import com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.EmailContentGenerator + -/** - * Created by shanhuang on 2019/10/12. - */ trait AbstractEmailContentGenerator extends EmailContentGenerator { - /** - * 兼容Visualis老版的变量设置方式 - * @param email - */ protected def formatSubjectOfOldVersion(email: Email): Unit = { var title = email.getSubject if (title.contains("YYYY-MM-DD HH:MM:SS")) { @@ -53,7 +46,6 @@ trait AbstractEmailContentGenerator extends EmailContentGenerator { val timeStr = sdf.format(new Date) title = title + timeStr } - // if (!title.contains("试运行版")) title = "【试运行版】" + title email.setSubject(title) } diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/generator/MultiEmailContentGenerator.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/generator/MultiEmailContentGenerator.scala new file mode 100644 index 000000000..4de225c6b --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/generator/MultiEmailContentGenerator.scala @@ -0,0 +1,48 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in 
compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.generator + +import com.webank.wedatasphere.dss.appconn.sendemail.email.Email +import com.webank.wedatasphere.dss.appconn.sendemail.email.domain.MultiContentEmail +import com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.domain.{ArrayEmailContent, StringEmailContent} +import com.webank.wedatasphere.linkis.common.utils.Logging + + +class MultiEmailContentGenerator extends AbstractEmailContentGenerator with Logging { + + override def generate(email: Email): Unit = email match { + case multiContentEmail: MultiContentEmail => + formatSubjectOfOldVersion(email) + formatSubject(multiContentEmail) + formatContent(multiContentEmail) + } + + protected def formatContent(email: MultiContentEmail): Unit = { + val sb: StringBuilder = new StringBuilder("<html>") + sb.append("<body>") + email.getEmailContents.foreach { + case emailContent: ArrayEmailContent => + emailContent.getContent.foreach(content => sb.append("<div>").append(content).append("</div>")) + case emailContent: StringEmailContent => + sb.append("<div>").append(emailContent.getContent).append("</div>") + } + sb.append("</body>") + sb.append("</html>") + email.setContent(sb.toString) + } + +} \ No newline at end of file diff --git a/sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/emailcontent/parser/AbstractEmailContentParser.scala
b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/parser/AbstractEmailContentParser.scala similarity index 75% rename from sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/emailcontent/parser/AbstractEmailContentParser.scala rename to dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/parser/AbstractEmailContentParser.scala index 98249bcdf..05851d718 100644 --- a/sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/emailcontent/parser/AbstractEmailContentParser.scala +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/parser/AbstractEmailContentParser.scala @@ -1,8 +1,7 @@ /* * Copyright 2019 WeBank - * * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. + * you may not use this file except in compliance with the License. 
* You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 @@ -15,16 +14,14 @@ * */ -package com.webank.wedatasphere.dss.appjoint.sendemail.emailcontent.parser +package com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.parser import java.lang.reflect.{ParameterizedType, Type} -import com.webank.wedatasphere.dss.appjoint.sendemail.email.MultiContentEmail -import com.webank.wedatasphere.dss.appjoint.sendemail.emailcontent.FsPathStoreEmailContent -import com.webank.wedatasphere.dss.appjoint.sendemail.{Email, EmailContent, EmailContentParser} -import com.webank.wedatasphere.dss.appjoint.sendemail.email.MultiContentEmail -import com.webank.wedatasphere.dss.appjoint.sendemail.emailcontent.FsPathStoreEmailContent -import com.webank.wedatasphere.dss.appjoint.sendemail.{Email, EmailContent, EmailContentParser} +import com.webank.wedatasphere.dss.appconn.sendemail.email.Email +import com.webank.wedatasphere.dss.appconn.sendemail.email.domain.MultiContentEmail +import com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.domain.FsPathStoreEmailContent +import com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.{EmailContent, EmailContentParser} import com.webank.wedatasphere.linkis.common.io.resultset.ResultSetReader import com.webank.wedatasphere.linkis.common.io.{MetaData, Record} import com.webank.wedatasphere.linkis.common.utils.Utils @@ -32,9 +29,6 @@ import com.webank.wedatasphere.linkis.storage.LineRecord import com.webank.wedatasphere.linkis.storage.resultset.ResultSetReader import org.apache.commons.io.IOUtils -/** - * Created by shanhuang on 2019/10/12. 
- */ abstract class AbstractEmailContentParser[T] extends EmailContentParser { override def parse(email: Email): Unit = email match { diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/parser/FileEmailContentParser.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/parser/FileEmailContentParser.scala new file mode 100644 index 000000000..91d3eefb3 --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/parser/FileEmailContentParser.scala @@ -0,0 +1,40 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.parser + +import com.webank.wedatasphere.dss.appconn.sendemail.email.domain.MultiContentEmail +import com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.domain.FileEmailContent +import com.webank.wedatasphere.linkis.common.utils.Utils +import com.webank.wedatasphere.linkis.storage.LineRecord +import org.apache.commons.io.IOUtils + +object FileEmailContentParser extends AbstractEmailContentParser[FileEmailContent] { + override protected def parseEmailContent(emailContent: FileEmailContent, + multiContentEmail: MultiContentEmail): Unit = { + val reader = getResultSetReader(emailContent) + val content = new StringBuilder + Utils.tryFinally{ + while(reader.hasNext) { + reader.getRecord match { + case lineRecord: LineRecord => + content.append(lineRecord.getLine).append("<br/>") + } + } + }(IOUtils.closeQuietly(reader)) + emailContent.setContent(content.toString()) + } +} \ No newline at end of file diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/parser/HtmlEmailContentParser.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/parser/HtmlEmailContentParser.scala new file mode 100644 index 000000000..cab083f70 --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/parser/HtmlEmailContentParser.scala @@ -0,0 +1,27 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.parser + +import com.webank.wedatasphere.dss.appconn.sendemail.email.domain.MultiContentEmail +import com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.domain.HtmlEmailContent + +object HtmlEmailContentParser extends AbstractEmailContentParser[HtmlEmailContent] { + override protected def parseEmailContent(emailContent: HtmlEmailContent, + multiContentEmail: MultiContentEmail): Unit = { + getFirstLineRecord(emailContent).foreach(emailContent.setContent) + } +} diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/parser/PictureEmailContentParser.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/parser/PictureEmailContentParser.scala new file mode 100644 index 000000000..1b771409e --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/parser/PictureEmailContentParser.scala @@ -0,0 +1,73 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.parser + +import java.awt.image.BufferedImage +import java.io.{ByteArrayInputStream, ByteArrayOutputStream} +import java.util.{Base64, UUID} + +import com.webank.wedatasphere.dss.appconn.sendemail.email.domain.{AbstractEmail, MultiContentEmail, PngAttachment} +import com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.domain.PictureEmailContent +import com.webank.wedatasphere.linkis.common.conf.Configuration +import javax.imageio.ImageIO +import org.apache.commons.codec.binary.Base64OutputStream +import com.webank.wedatasphere.dss.appconn.sendemail.conf.SendEmailAppConnConfiguration._ + +object PictureEmailContentParser extends AbstractEmailContentParser[PictureEmailContent] { + + override protected def parseEmailContent(emailContent: PictureEmailContent, + multiContentEmail: MultiContentEmail): Unit = { + getFirstLineRecord(emailContent).foreach { imageStr => + val decoder = Base64.getDecoder + val byteArr = decoder.decode(imageStr) + val inputStream = new ByteArrayInputStream(byteArr) + val image = ImageIO.read(inputStream) + val contents = generateImage(image, multiContentEmail) + emailContent.setContent(contents) + } + } + + protected def generateImage(bufferedImage: BufferedImage, email: AbstractEmail): Array[String] = { + val imageUUID: String = UUID.randomUUID.toString + val width: Int = bufferedImage.getWidth + val height: Int = bufferedImage.getHeight + val imagesCuts = if (height > EMAIL_IMAGE_HEIGHT.getValue) { + val numOfCut = Math.ceil(height.toDouble 
/ EMAIL_IMAGE_HEIGHT.getValue).toInt + val realHeight = height / numOfCut + (0 until numOfCut).map(i => bufferedImage.getSubimage(0, i * realHeight, width, realHeight)).toArray + } else Array(bufferedImage) + imagesCuts.indices.map { index => + val image = imagesCuts(index) + val imageName = index + "_" + imageUUID + ".png" + val os = new ByteArrayOutputStream + val b64Stream = new Base64OutputStream(os) + ImageIO.write(image, "png", b64Stream) + val b64 = os.toString(Configuration.BDP_ENCODING.getValue) + email.addAttachment(new PngAttachment(imageName, b64)) + + var iHeight = image.getHeight + var iWidth = image.getWidth + + if (iWidth > EMAIL_IMAGE_WIDTH.getValue) { + iHeight = ((EMAIL_IMAGE_WIDTH.getValue.toDouble / iWidth.toDouble) * iHeight.toDouble).toInt + iWidth = EMAIL_IMAGE_WIDTH.getValue + } + s"""<img width="$iWidth" height="$iHeight" src="cid:$imageName"></img>""" + }.toArray + } + +} diff --git a/sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/emailcontent/parser/TableEmailContentParser.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/parser/TableEmailContentParser.scala similarity index 78% rename from sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/emailcontent/parser/TableEmailContentParser.scala rename to dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/parser/TableEmailContentParser.scala index 3b832185e..fecabbe7a 100644 --- a/sendemail-appjoint/sendemail-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/sendemail/emailcontent/parser/TableEmailContentParser.scala +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/emailcontent/parser/TableEmailContentParser.scala @@ -1,8 +1,7 @@ /* * Copyright 2019 WeBank - * * Licensed under the Apache
License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. + * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 @@ -15,19 +14,14 @@ * */ -package com.webank.wedatasphere.dss.appjoint.sendemail.emailcontent.parser +package com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.parser -import com.webank.wedatasphere.dss.appjoint.sendemail.email.MultiContentEmail -import com.webank.wedatasphere.dss.appjoint.sendemail.emailcontent.TableEmailContent +import com.webank.wedatasphere.dss.appconn.sendemail.email.domain.MultiContentEmail +import com.webank.wedatasphere.dss.appconn.sendemail.emailcontent.domain.TableEmailContent import com.webank.wedatasphere.linkis.common.utils.Utils import com.webank.wedatasphere.linkis.storage.resultset.table.{TableMetaData, TableRecord} import org.apache.commons.io.IOUtils import org.apache.commons.lang.StringUtils -import org.springframework.stereotype.Component - -/** - * Created by shanhuang on 2019/10/12. 
- */ object TableEmailContentParser extends AbstractEmailContentParser[TableEmailContent] { override protected def parseEmailContent(emailContent: TableEmailContent, @@ -39,9 +33,11 @@ object TableEmailContentParser extends AbstractEmailContentParser[TableEmailCont writeTableTH(tableMetaData, content) } Utils.tryFinally { - while(reader.hasNext) reader.getRecord match { - case tableRecord: TableRecord => - writeTableTR(tableRecord, content) + while(reader.hasNext) { + reader.getRecord match { + case tableRecord: TableRecord => + writeTableTR(tableRecord, content) + } } }(IOUtils.closeQuietly(reader)) emailContent.setContent(content.toString()) diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/hook/AbstractSendEmailRefExecutionHook.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/hook/AbstractSendEmailRefExecutionHook.scala new file mode 100644 index 000000000..b146b5930 --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/hook/AbstractSendEmailRefExecutionHook.scala @@ -0,0 +1,33 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.hook + +import com.webank.wedatasphere.dss.appconn.sendemail.email.Email +import com.webank.wedatasphere.dss.appconn.sendemail.exception.EmailSendFailedException +import com.webank.wedatasphere.dss.standard.app.development.listener.common.AsyncExecutionRequestRef +import com.webank.wedatasphere.dss.standard.app.development.listener.core.ExecutionRequestRefContext +import com.webank.wedatasphere.dss.standard.app.development.ref.ExecutionRequestRef + +abstract class AbstractSendEmailRefExecutionHook extends SendEmailRefExecutionHook { + + protected def getExecutionRequestRefContext(requestRef: ExecutionRequestRef): ExecutionRequestRefContext = requestRef match { + case async: AsyncExecutionRequestRef => async.getExecutionRequestRefContext + case _ => throw new EmailSendFailedException(80002, "ExecutionRequestRefContext is empty!") + } + + override def preSend(requestRef: ExecutionRequestRef, email: Email): Unit = {} +} diff --git a/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/hook/SendEmailRefExecutionHook.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/hook/SendEmailRefExecutionHook.scala new file mode 100644 index 000000000..3c2bd7c00 --- /dev/null +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/dss/appconn/sendemail/hook/SendEmailRefExecutionHook.scala @@ -0,0 +1,31 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.sendemail.hook + +import com.webank.wedatasphere.dss.appconn.sendemail.email.Email +import com.webank.wedatasphere.dss.appconn.sendemail.exception.EmailSendFailedException +import com.webank.wedatasphere.dss.standard.app.development.ref.ExecutionRequestRef + + +trait SendEmailRefExecutionHook { + + @throws(classOf[EmailSendFailedException]) + def preGenerate(requestRef: ExecutionRequestRef): Unit + + def preSend(requestRef: ExecutionRequestRef, email: Email): Unit + +} diff --git a/visualis-appjoint/appjoint/src/main/scala/com/webank/wedatasphere/linkis/server/JSONUtils.scala b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/linkis/server/JSONUtils.scala similarity index 83% rename from visualis-appjoint/appjoint/src/main/scala/com/webank/wedatasphere/linkis/server/JSONUtils.scala rename to dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/linkis/server/JSONUtils.scala index c5b348ff0..7267ae81d 100644 --- a/visualis-appjoint/appjoint/src/main/scala/com/webank/wedatasphere/linkis/server/JSONUtils.scala +++ b/dss-appconn/appconns/dss-sendemail-appconn/sendemail-appconn-core/src/main/scala/com/webank/wedatasphere/linkis/server/JSONUtils.scala @@ -1,8 +1,7 @@ /* * Copyright 2019 WeBank - * * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. 
+ * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 @@ -17,11 +16,9 @@ package com.webank.wedatasphere.linkis.server -/** - * Created by enjoyyin on 2019/10/12. - */ object JSONUtils { val gson = BDPJettyServerHelper.gson + val jackson = BDPJettyServerHelper.jacksonJson } diff --git a/dss-appconn/appconns/dss-visualis-appconn/pom.xml b/dss-appconn/appconns/dss-visualis-appconn/pom.xml new file mode 100644 index 000000000..fed552a4f --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/pom.xml @@ -0,0 +1,207 @@ + + + + + + dss + com.webank.wedatasphere.dss + 1.0.0 + + 4.0.0 + + dss-visualis-appconn + + + + com.webank.wedatasphere.dss + dss-project-plugin + ${dss.version} + + + com.webank.wedatasphere.dss + dss-appconn-core + ${dss.version} + + + com.webank.wedatasphere.dss + spring-origin-dss-project-plugin + ${dss.version} + + + + com.webank.wedatasphere.dss + dss-structure-integration-standard + ${dss.version} + + + + com.webank.wedatasphere.dss + dss-development-process-standard + ${dss.version} + + + com.webank.wedatasphere.dss + dss-development-process-standard-execution + ${dss.version} + + + + com.webank.wedatasphere.linkis + linkis-module + ${linkis.version} + provided + + + httpclient + org.apache.httpcomponents + + + true + + + com.webank.wedatasphere.linkis + linkis-cs-common + ${linkis.version} + compile + + + linkis-bml-client + + + gson + com.google.code.gson + + + com.webank.wedatasphere.linkis + ${linkis.version} + provided + true + + + + com.webank.wedatasphere.linkis + linkis-httpclient + ${linkis.version} + + + linkis-common + com.webank.wedatasphere.linkis + + + json4s-jackson_2.11 + org.json4s + + + + + + com.webank.wedatasphere.linkis + linkis-storage + ${linkis.version} + provided + + + linkis-common + com.webank.wedatasphere.linkis + + + + + + com.webank.wedatasphere.dss + dss-common + ${dss.version} + provided + + + + + + + + 
org.apache.maven.plugins + maven-deploy-plugin + + + + net.alchim31.maven + scala-maven-plugin + + + org.apache.maven.plugins + maven-jar-plugin + + + org.apache.maven.plugins + maven-assembly-plugin + 2.3 + false + + + make-assembly + package + + single + + + + src/main/assembly/distribution.xml + + + + + + false + out + false + false + + src/main/assembly/distribution.xml + + + + + org.apache.maven.plugins + maven-gpg-plugin + + true + + + + + + src/main/java + + **/*.xml + + + + src/main/resources + + **/*.properties + **/application.yml + **/bootstrap.yml + **/log4j2.xml + + + + + + + \ No newline at end of file diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/assembly/distribution.xml b/dss-appconn/appconns/dss-visualis-appconn/src/main/assembly/distribution.xml new file mode 100644 index 000000000..89f053c69 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/assembly/distribution.xml @@ -0,0 +1,74 @@ + + + + dss-visualis-appconn + + dir + + true + visualis + + + + + + lib + true + true + false + true + true + + + + + + + + ${basedir}/conf + + * + + 0777 + conf + unix + + + . + + */** + + logs + + + + ${basedir}/src/main/resources + + init.sql + + 0777 + db + + + + + + + diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/VisualisAppConn.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/VisualisAppConn.java new file mode 100644 index 000000000..976301827 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/VisualisAppConn.java @@ -0,0 +1,48 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.visualis; + +import com.webank.wedatasphere.dss.appconn.core.ext.ThirdlyAppConn; +import com.webank.wedatasphere.dss.appconn.core.impl.AbstractOnlySSOAppConn; +import com.webank.wedatasphere.dss.standard.app.development.standard.DevelopmentIntegrationStandard; +import com.webank.wedatasphere.dss.standard.app.structure.StructureIntegrationStandard; +import com.webank.wedatasphere.linkis.common.conf.CommonVars; + +public class VisualisAppConn extends AbstractOnlySSOAppConn implements ThirdlyAppConn { + + public static final String VISUALIS_APPCONN_NAME = CommonVars.apply("wds.dss.appconn.visualis.name", "Visualis").getValue(); + + private VisualisDevelopmentIntegrationStandard developmentIntegrationStandard; + private VisualisStructureIntegrationStandard structureIntegrationStandard; + + @Override + protected void initialize() { + structureIntegrationStandard = new VisualisStructureIntegrationStandard(); + developmentIntegrationStandard = new VisualisDevelopmentIntegrationStandard(); + } + + @Override + public StructureIntegrationStandard getOrCreateStructureStandard() { + return structureIntegrationStandard; + } + + @Override + public DevelopmentIntegrationStandard getOrCreateDevelopmentStandard() { + return developmentIntegrationStandard; + } + +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/VisualisDevelopmentIntegrationStandard.java 
b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/VisualisDevelopmentIntegrationStandard.java new file mode 100644 index 000000000..5065cb5e8 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/VisualisDevelopmentIntegrationStandard.java @@ -0,0 +1,51 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.visualis; + +import com.webank.wedatasphere.dss.appconn.visualis.execution.VisualisExecutionService; +import com.webank.wedatasphere.dss.appconn.visualis.service.*; +import com.webank.wedatasphere.dss.standard.app.development.service.*; +import com.webank.wedatasphere.dss.standard.app.development.standard.AbstractDevelopmentIntegrationStandard; + +public class VisualisDevelopmentIntegrationStandard extends AbstractDevelopmentIntegrationStandard { + + @Override + protected RefCRUDService createRefCRUDService() { + return new VisualisCRUDService(); + } + + @Override + protected RefExecutionService createRefExecutionService() { + return new VisualisExecutionService(); + } + + @Override + protected RefExportService createRefExportService() { + return new VisualisRefExportService(); + } + + @Override + protected RefImportService createRefImportService() { + return new VisualisRefImportService(); + } + + @Override + protected RefQueryService createRefQueryService() 
{ + return new VisualisQueryService(); + } + +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/VisualisStructureIntegrationStandard.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/VisualisStructureIntegrationStandard.java new file mode 100644 index 000000000..c35fb1357 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/VisualisStructureIntegrationStandard.java @@ -0,0 +1,30 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
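VisualisDevelopmentIntegrationStandard above only overrides `create*` factory methods; the abstract standard owns everything else. A framework-free sketch of that template-method wiring (class and method names here are illustrative stand-ins, not the real DSS API, and the lazy caching shown is an assumption for illustration):

```java
// Sketch of the template-method pattern used by the development integration
// standard: the abstract base asks the subclass for concrete services, so a
// new AppConn only needs to override the create* factories.
abstract class DevelopmentStandardSketch {
    private Object crudService;

    // Lazily create and cache the service the subclass provides; the real
    // standard exposes five such services (CRUD, Execution, Export, Import,
    // Query).
    public synchronized Object getRefCRUDService() {
        if (crudService == null) {
            crudService = createRefCRUDService();
        }
        return crudService;
    }

    protected abstract Object createRefCRUDService();
}

class VisualisStandardSketch extends DevelopmentStandardSketch {
    @Override
    protected Object createRefCRUDService() {
        // Stand-in for `new VisualisCRUDService()`.
        return "VisualisCRUDService";
    }
}
```

The base class never names a concrete Visualis type, which is what lets the same standard back any third-party system.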
+ * + */ + +package com.webank.wedatasphere.dss.appconn.visualis; + +import com.webank.wedatasphere.dss.appconn.visualis.project.VisualisProjectService; +import com.webank.wedatasphere.dss.standard.app.structure.AbstractStructureIntegrationStandard; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectService; + + +public class VisualisStructureIntegrationStandard extends AbstractStructureIntegrationStandard { + + @Override + protected ProjectService createProjectService() { + return new VisualisProjectService(); + } +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/execution/VisualisCompletedExecutionResponseRef.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/execution/VisualisCompletedExecutionResponseRef.java new file mode 100644 index 000000000..6442fca49 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/execution/VisualisCompletedExecutionResponseRef.java @@ -0,0 +1,47 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.execution; + +import com.webank.wedatasphere.dss.standard.app.development.listener.common.CompletedExecutionResponseRef; + +import java.util.Map; + +public class VisualisCompletedExecutionResponseRef extends CompletedExecutionResponseRef { + + public VisualisCompletedExecutionResponseRef(int status, String errorMessage){ + super(status); + this.errorMsg = errorMessage; + } + + public VisualisCompletedExecutionResponseRef(int status) { + super(status); + } + + public VisualisCompletedExecutionResponseRef(String responseBody, int status) { + super(responseBody, status); + } + + @Override + public Map toMap() { + return null; + } + + @Override + public String getErrorMsg() { + return this.errorMsg; + } +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/execution/VisualisExecutionService.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/execution/VisualisExecutionService.java new file mode 100644 index 000000000..0970daec6 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/execution/VisualisExecutionService.java @@ -0,0 +1,30 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
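VisualisCompletedExecutionResponseRef above carries an HTTP-style status plus an optional error message, with 200 signalling success. A simplified stand-in (names and the `isSucceed` helper are illustrative, not the DSS API):

```java
// Minimal sketch of a completed-execution response: a status code and an
// optional error message, mirroring the constructors above.
class CompletedResponseSketch {
    private final int status;
    private final String errorMsg;

    CompletedResponseSketch(int status) {
        this(status, null);
    }

    CompletedResponseSketch(int status, String errorMsg) {
        this.status = status;
        this.errorMsg = errorMsg;
    }

    boolean isSucceed() {
        return status == 200;
    }

    String getErrorMsg() {
        return errorMsg;
    }
}
```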
+ * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.execution; + +import com.webank.wedatasphere.dss.standard.app.development.operation.RefExecutionOperation; +import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefExecutionService; + +public class VisualisExecutionService extends AbstractRefExecutionService { + + @Override + public RefExecutionOperation createRefExecutionOperation() { + VisualisRefExecutionOperation visualisRefExecutionOperation = new VisualisRefExecutionOperation(this); + return visualisRefExecutionOperation; + } + +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/execution/VisualisRefExecutionOperation.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/execution/VisualisRefExecutionOperation.java new file mode 100644 index 000000000..0b0d1337c --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/execution/VisualisRefExecutionOperation.java @@ -0,0 +1,212 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.execution; + +import com.google.common.collect.Lists; +import com.webank.wedatasphere.dss.appconn.visualis.VisualisAppConn; +import com.webank.wedatasphere.dss.appconn.visualis.model.WidgetResultData; +import com.webank.wedatasphere.dss.appconn.visualis.ref.VisualisCommonResponseRef; +import com.webank.wedatasphere.dss.appconn.visualis.utils.NumberUtils; +import com.webank.wedatasphere.dss.appconn.visualis.utils.URLUtils; +import com.webank.wedatasphere.dss.appconn.visualis.utils.VisualisDownloadAction; +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; +import com.webank.wedatasphere.dss.standard.app.development.ref.ExecutionRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefExecutionOperation; +import com.webank.wedatasphere.dss.standard.app.development.listener.common.AsyncExecutionRequestRef; +import com.webank.wedatasphere.dss.standard.app.sso.builder.SSOUrlBuilderOperation; +import com.webank.wedatasphere.dss.standard.app.sso.plugin.SSOIntegrationConf; +import com.webank.wedatasphere.dss.standard.app.sso.request.SSORequestOperation; +import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import com.webank.wedatasphere.linkis.common.io.resultset.ResultSetWriter; +import com.webank.wedatasphere.linkis.httpclient.request.HttpAction; +import com.webank.wedatasphere.linkis.httpclient.response.HttpResult; +import com.webank.wedatasphere.linkis.server.BDPJettyServerHelper; +import com.webank.wedatasphere.linkis.server.conf.ServerConfiguration; +import com.webank.wedatasphere.linkis.storage.LineMetaData; +import com.webank.wedatasphere.linkis.storage.LineRecord; +import com.webank.wedatasphere.linkis.storage.domain.Column; +import com.webank.wedatasphere.linkis.storage.domain.DataType; +import 
com.webank.wedatasphere.linkis.storage.resultset.table.TableMetaData; +import com.webank.wedatasphere.linkis.storage.resultset.table.TableRecord; +import org.apache.commons.io.IOUtils; +import org.apache.commons.lang3.StringUtils; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import java.io.ByteArrayOutputStream; +import java.util.Base64; +import java.util.List; +import java.util.Map; + +public class VisualisRefExecutionOperation implements RefExecutionOperation { + + private final static Logger logger = LoggerFactory.getLogger(VisualisRefExecutionOperation.class); + DevelopmentService developmentService; + private SSORequestOperation ssoRequestOperation; + + public VisualisRefExecutionOperation(DevelopmentService service) { + this.developmentService = service; + this.ssoRequestOperation = this.developmentService.getSSORequestService().createSSORequestOperation(getAppName()); + } + + private String getAppName() { + return VisualisAppConn.VISUALIS_APPCONN_NAME; + } + + @Override + public ResponseRef execute(ExecutionRequestRef ref) throws ExternalOperationFailedException { + AsyncExecutionRequestRef asyncExecutionRequestRef = (AsyncExecutionRequestRef) ref; + String nodeType = asyncExecutionRequestRef.getExecutionRequestRefContext().getRuntimeMap().get("nodeType").toString(); + if("visualis.widget".equalsIgnoreCase(nodeType)){ + return executeWidget(asyncExecutionRequestRef); + } else if("visualis.display".equalsIgnoreCase(nodeType)){ + return executePreview(asyncExecutionRequestRef, + URLUtils.getUrl(getBaseUrl(), URLUtils.DISPLAY_PREVIEW_URL_FORMAT, getId(asyncExecutionRequestRef)), + URLUtils.getUrl(getBaseUrl(), URLUtils.DISPLAY_METADATA_URL_FORMAT, getId(asyncExecutionRequestRef))); + } else if("visualis.dashboard".equalsIgnoreCase(nodeType)){ + return executePreview(asyncExecutionRequestRef, + URLUtils.getUrl(getBaseUrl(), URLUtils.DASHBOARD_PREVIEW_URL_FORMAT, getId(asyncExecutionRequestRef)), + URLUtils.getUrl(getBaseUrl(), 
URLUtils.DASHBOARD_METADATA_URL_FORMAT, getId(asyncExecutionRequestRef))); + } else { + throw new ExternalOperationFailedException(90177, "Unknown task type " + nodeType, null); + } + } + + + private ResponseRef executeWidget(AsyncExecutionRequestRef ref) throws ExternalOperationFailedException { + String url = URLUtils.getUrl(getBaseUrl(), URLUtils.WIDGET_DATA_URL_FORMAT, getId(ref)); + ref.getExecutionRequestRefContext().appendLog("Ready to get result set from " + url); + List columns = Lists.newArrayList(); + List tableRecords = Lists.newArrayList(); + VisualisDownloadAction visualisDownloadAction = new VisualisDownloadAction(); + visualisDownloadAction.setUser(getUser(ref)); + SSOUrlBuilderOperation ssoUrlBuilderOperation = ref.getWorkspace().getSSOUrlBuilderOperation().copy(); + ssoUrlBuilderOperation.setAppName(getAppName()); + ssoUrlBuilderOperation.setReqUrl(url); + ssoUrlBuilderOperation.setWorkspace(ref.getWorkspace().getWorkspaceName()); + try{ + visualisDownloadAction.setURL(ssoUrlBuilderOperation.getBuiltUrl()); + HttpResult httpResult = this.ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, visualisDownloadAction); + WidgetResultData responseData = BDPJettyServerHelper.gson().fromJson(IOUtils.toString(visualisDownloadAction.getInputStream()), WidgetResultData.class); + if(responseData.getData().getColumns().isEmpty()){ + ref.getExecutionRequestRefContext().appendLog("Cannot execute an empty Widget!"); + throw new ExternalOperationFailedException(90176, "Cannot execute an empty Widget!", null); + } + for (WidgetResultData.Column columnData : responseData.getData().getColumns()) { + columns.add(new Column(columnData.getName(), DataType.toDataType(columnData.getType().toLowerCase()), "")); + } + ResultSetWriter resultSetWriter = ref.getExecutionRequestRefContext().createTableResultSetWriter(); + resultSetWriter.addMetaData(new TableMetaData(columns.toArray(new Column[0]))); + for (Map recordMap : responseData.getData().getResultList()) { + 
resultSetWriter.addRecord(new TableRecord(recordMap.values().toArray())); + } + resultSetWriter.flush(); + IOUtils.closeQuietly(resultSetWriter); + ref.getExecutionRequestRefContext().sendResultSet(resultSetWriter); + } catch (Throwable e){ + ref.getExecutionRequestRefContext().appendLog("Failed to debug Widget url " + url); + ref.getExecutionRequestRefContext().appendLog(e.getMessage()); + logger.error("executeWidget error:",e); + throw new ExternalOperationFailedException(90176, "Failed to debug Widget", e); + } finally { + IOUtils.closeQuietly(visualisDownloadAction.getInputStream()); + } + return new VisualisCompletedExecutionResponseRef(200); + } + + private ResponseRef executePreview(AsyncExecutionRequestRef ref, String previewUrl, String metaUrl) throws ExternalOperationFailedException { + ref.getExecutionRequestRefContext().appendLog("Ready to get result set from " + previewUrl); + VisualisDownloadAction previewDownloadAction = new VisualisDownloadAction(); + previewDownloadAction.setUser(getUser(ref)); + + VisualisDownloadAction metadataDownloadAction = new VisualisDownloadAction(); + metadataDownloadAction.setUser(getUser(ref)); + + try{ + logger.info("got workspace" + ref.getWorkspace()); + SSOUrlBuilderOperation ssoUrlBuilderOperation = ref.getWorkspace().getSSOUrlBuilderOperation().copy(); + ssoUrlBuilderOperation.setAppName(getAppName()); + ssoUrlBuilderOperation.setAppName(getAppName()); + ssoUrlBuilderOperation.setReqUrl(previewUrl); + ssoUrlBuilderOperation.setWorkspace(ref.getWorkspace().getWorkspaceName()); + logger.info("got getSSOUrlBuilderOperation:" + SSOIntegrationConf.gson().toJson(ssoUrlBuilderOperation)); + logger.info("got getSSOUrlBuilderOperation built url:" + ssoUrlBuilderOperation.getBuiltUrl()); + previewDownloadAction.setURL(ssoUrlBuilderOperation.getBuiltUrl()); + HttpResult previewResult = this.ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, previewDownloadAction); + ByteArrayOutputStream os = new 
ByteArrayOutputStream(); + IOUtils.copy(previewDownloadAction.getInputStream(), os); + String response = new String(Base64.getEncoder().encode(os.toByteArray())); + + SSOUrlBuilderOperation ssoUrlBuilderOperationMeta = ref.getWorkspace().getSSOUrlBuilderOperation().copy(); + ssoUrlBuilderOperationMeta.setAppName(getAppName()); + ssoUrlBuilderOperationMeta.setAppName(getAppName()); + ssoUrlBuilderOperationMeta.setReqUrl(metaUrl); + ssoUrlBuilderOperationMeta.setWorkspace(ref.getWorkspace().getWorkspaceName()); + metadataDownloadAction.setURL(ssoUrlBuilderOperationMeta.getBuiltUrl()); + HttpResult metaResult = this.ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperationMeta, metadataDownloadAction); + String metadata = StringUtils.chomp(IOUtils.toString(metadataDownloadAction.getInputStream(), ServerConfiguration.BDP_SERVER_ENCODING().getValue())); + ResultSetWriter resultSetWriter = ref.getExecutionRequestRefContext().createPictureResultSetWriter(); + resultSetWriter.addMetaData(new LineMetaData(metadata)); + resultSetWriter.addRecord(new LineRecord(response)); + resultSetWriter.flush(); + IOUtils.closeQuietly(resultSetWriter); + ref.getExecutionRequestRefContext().sendResultSet(resultSetWriter); + } catch (Throwable e){ + ref.getExecutionRequestRefContext().appendLog("Failed to debug Display url " + previewUrl); + logger.error(e.getMessage(), e); + throw new ExternalOperationFailedException(90176, "Failed to debug Display", e); + } finally { + IOUtils.closeQuietly(previewDownloadAction.getInputStream()); + IOUtils.closeQuietly(metadataDownloadAction.getInputStream()); + } + return new VisualisCompletedExecutionResponseRef(200); + } + + private String getUser(AsyncExecutionRequestRef requestRef) { + return requestRef.getExecutionRequestRefContext().getRuntimeMap().get("wds.dss.workflow.submit.user").toString(); + } + + private String getId(AsyncExecutionRequestRef requestRef) { + try { + String executionContent = 
BDPJettyServerHelper.jacksonJson().writeValueAsString(requestRef.getJobContent()); + String nodeType = requestRef.getExecutionRequestRefContext().getRuntimeMap().get("nodeType").toString(); + if("visualis.display".equalsIgnoreCase(nodeType)){ + VisualisCommonResponseRef displayCreateResponseRef = new VisualisCommonResponseRef(executionContent); + return NumberUtils.parseDoubleString(displayCreateResponseRef.getDisplayId()); + } else if("visualis.dashboard".equalsIgnoreCase(nodeType)){ + VisualisCommonResponseRef dashboardCreateResponseRef = new VisualisCommonResponseRef(executionContent); + return NumberUtils.parseDoubleString(dashboardCreateResponseRef.getDashboardId()); + } else if ("visualis.widget".equalsIgnoreCase(nodeType)){ + VisualisCommonResponseRef widgetCreateResponseRef = new VisualisCommonResponseRef(executionContent); + return NumberUtils.parseDoubleString(widgetCreateResponseRef.getWidgetId()); + } + } catch (Exception e) { + logger.error("failed to parse jobContent", e); + } + return null; + } + + @Override + public void setDevelopmentService(DevelopmentService service) { + this.developmentService = service; + } + + private String getBaseUrl(){ + return developmentService.getAppInstance().getBaseUrl(); + } +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/HttpResponseModel.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/HttpResponseModel.java new file mode 100644 index 000000000..15e7a7baa --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/HttpResponseModel.java @@ -0,0 +1,47 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
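The `execute` method of VisualisRefExecutionOperation above routes on the node type: widget nodes download a tabular result set, display and dashboard nodes download a preview picture, and anything else fails with error code 90177. A self-contained sketch of that dispatch (return values are stand-ins for the `executeWidget` / `executePreview` branches):

```java
// Sketch of the node-type dispatch in VisualisRefExecutionOperation.execute().
class NodeDispatchSketch {
    static String route(String nodeType) {
        if ("visualis.widget".equalsIgnoreCase(nodeType)) {
            return "widget-data";        // executeWidget(...) path
        } else if ("visualis.display".equalsIgnoreCase(nodeType)
                || "visualis.dashboard".equalsIgnoreCase(nodeType)) {
            return "preview-picture";    // executePreview(...) path
        }
        // DSS raises ExternalOperationFailedException(90177, ...) here.
        throw new IllegalArgumentException("Unknown task type " + nodeType);
    }
}
```

Note the case-insensitive comparison: the real code also uses `equalsIgnoreCase`, so `VISUALIS.WIDGET` routes the same as `visualis.widget`.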
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.model; + +public abstract class HttpResponseModel { + private String method; + private int status; + private String message; + + public String getMethod() { + return method; + } + + public void setMethod(String method) { + this.method = method; + } + + public int getStatus() { + return status; + } + + public void setStatus(int status) { + this.status = status; + } + + public String getMessage() { + return message; + } + + public void setMessage(String message) { + this.message = message; + } +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/VisualisDeleteAction.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/VisualisDeleteAction.java new file mode 100644 index 000000000..2755d9a2e --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/VisualisDeleteAction.java @@ -0,0 +1,45 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
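In `executePreview` above, the downloaded preview image bytes are buffered and Base64-encoded before being written as a single `LineRecord` alongside the metadata line. A framework-free sketch of just that encoding step (the method name is illustrative):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Base64;

// Sketch of the preview-encoding step: copy the response stream into memory,
// then Base64-encode the bytes so the picture can travel as one text record.
class PreviewEncodeSketch {
    static String encodePreview(InputStream previewStream) throws IOException {
        ByteArrayOutputStream os = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = previewStream.read(buf)) != -1) {
            os.write(buf, 0, n);
        }
        // Equivalent to new String(Base64.getEncoder().encode(os.toByteArray()))
        // in the original code.
        return Base64.getEncoder().encodeToString(os.toByteArray());
    }
}
```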
+ * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.model; + +import com.webank.wedatasphere.linkis.httpclient.request.DeleteAction; +import com.webank.wedatasphere.linkis.httpclient.request.UserAction; + +public class VisualisDeleteAction extends DeleteAction implements UserAction { + + String url; + String user; + + @Override + public String getURL() { + return url; + } + + public void setUrl(String url) { + this.url = url; + } + + @Override + public void setUser(String user) { + this.user = user; + } + + @Override + public String getUser() { + return user; + } +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/VisualisGetAction.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/VisualisGetAction.java new file mode 100644 index 000000000..bc22aaa25 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/VisualisGetAction.java @@ -0,0 +1,45 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.model; + +import com.webank.wedatasphere.linkis.httpclient.request.GetAction; +import com.webank.wedatasphere.linkis.httpclient.request.UserAction; + +public class VisualisGetAction extends GetAction implements UserAction { + + String url; + String user; + + @Override + public String getURL() { + return url; + } + + public void setUrl(String url) { + this.url = url; + } + + @Override + public void setUser(String user) { + this.user = user; + } + + @Override + public String getUser() { + return user; + } +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/VisualisPostAction.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/VisualisPostAction.java new file mode 100644 index 000000000..fdc7c65b6 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/VisualisPostAction.java @@ -0,0 +1,60 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ *
+ */
+
+package com.webank.wedatasphere.dss.appconn.visualis.model;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.webank.wedatasphere.linkis.httpclient.request.POSTAction;
+import com.webank.wedatasphere.linkis.httpclient.request.UserAction;
+import com.webank.wedatasphere.linkis.server.BDPJettyServerHelper;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+public class VisualisPostAction extends POSTAction implements UserAction {
+
+    private static final Logger LOGGER = LoggerFactory.getLogger(VisualisPostAction.class);
+    String url;
+    String user;
+
+    @Override
+    public String getRequestPayload() {
+        try {
+            return BDPJettyServerHelper.jacksonJson().writeValueAsString(getRequestPayloads());
+        } catch (JsonProcessingException e) {
+            LOGGER.error("failed to convert {} to a string", getRequestPayloads(), e);
+            return "";
+        }
+    }
+
+    @Override
+    public String getURL() {
+        return url;
+    }
+
+    public void setUrl(String url) {
+        this.url = url;
+    }
+
+    @Override
+    public void setUser(String user) {
+        this.user = user;
+    }
+
+    @Override
+    public String getUser() {
+        return user;
+    }
+}
diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/VisualisPutAction.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/VisualisPutAction.java
new file mode 100644
index 000000000..40ee9bee7
--- /dev/null
+++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/VisualisPutAction.java
@@ -0,0 +1,60 @@
+/*
+ * Copyright 2019 WeBank
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
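VisualisPostAction (and VisualisPutAction below) serializes its payload map defensively: if JSON serialization throws, the action logs the failure and returns an empty string rather than propagating the exception. A stdlib-only sketch of that fallback, where `Serializer` is a stand-in for `BDPJettyServerHelper.jacksonJson()`:

```java
import java.util.Map;

// Sketch of the defensive payload serialization: a failing serializer
// degrades to an empty request body instead of failing the whole request.
class PostActionSketch {
    interface Serializer {
        String write(Map<String, Object> payload) throws Exception;
    }

    static String requestPayload(Map<String, Object> payload, Serializer json) {
        try {
            return json.write(payload);
        } catch (Exception e) {
            // The real action logs the payload and exception here.
            return "";
        }
    }
}
```

Whether an empty body is the right degradation is a design choice; it keeps the HTTP client from crashing but can turn a serialization bug into a confusing server-side 400.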
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+package com.webank.wedatasphere.dss.appconn.visualis.model;
+
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.webank.wedatasphere.linkis.httpclient.request.PutAction;
+import com.webank.wedatasphere.linkis.httpclient.request.UserAction;
+import com.webank.wedatasphere.linkis.server.BDPJettyServerHelper;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+public class VisualisPutAction extends PutAction implements UserAction {
+
+    private static final Logger LOGGER = LoggerFactory.getLogger(VisualisPutAction.class);
+    String url;
+    String user;
+
+    @Override
+    public String getRequestPayload() {
+        try {
+            return BDPJettyServerHelper.jacksonJson().writeValueAsString(getRequestPayloads());
+        } catch (JsonProcessingException e) {
+            LOGGER.error("failed to convert {} to a string", getRequestPayloads(), e);
+            return "";
+        }
+    }
+
+    @Override
+    public String getURL() {
+        return url;
+    }
+
+    public void setUrl(String url) {
+        this.url = url;
+    }
+
+    @Override
+    public void setUser(String user) {
+        this.user = user;
+    }
+
+    @Override
+    public String getUser() {
+        return user;
+    }
+}
diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/WidgetResultData.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/WidgetResultData.java
new file mode 100644
index 000000000..405c69eec
--- /dev/null
+++
b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/model/WidgetResultData.java @@ -0,0 +1,76 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.model; + +import java.util.List; +import java.util.Map; + +public class WidgetResultData extends HttpResponseModel{ + public static class Column{ + private String name; + private String type; + + public String getName() { + return name; + } + + public void setName(String name) { + this.name = name; + } + + public String getType() { + return type; + } + + public void setType(String type) { + this.type = type; + } + } + + + public static class Data{ + private List<Column> columns; + private List<Map<String, Object>> resultList; + + public List<Column> getColumns() { + return columns; + } + + public void setColumns(List<Column> columns) { + this.columns = columns; + } + + public List<Map<String, Object>> getResultList() { + return resultList; + } + + public void setResultList(List<Map<String, Object>> resultList) { + this.resultList = resultList; + } + } + + private Data data; + + public Data getData() { + return data; + } + + public void setData(Data data) { + this.data = data; + } +} + diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefCreationOperation.java
b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefCreationOperation.java new file mode 100644 index 000000000..97b7f8d0e --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefCreationOperation.java @@ -0,0 +1,249 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.operation; + +import com.google.common.collect.Maps; +import com.google.gson.JsonArray; +import com.google.gson.JsonObject; +import com.google.gson.JsonParser; +import com.google.gson.reflect.TypeToken; +import com.webank.wedatasphere.dss.appconn.visualis.VisualisAppConn; +import com.webank.wedatasphere.dss.appconn.visualis.model.VisualisPostAction; +import com.webank.wedatasphere.dss.appconn.visualis.ref.*; +import com.webank.wedatasphere.dss.appconn.visualis.utils.URLUtils; +import com.webank.wedatasphere.dss.common.entity.node.DSSNode; +import com.webank.wedatasphere.dss.common.entity.node.DSSNodeDefault; +import com.webank.wedatasphere.dss.common.utils.DSSCommonUtils; +import com.webank.wedatasphere.dss.appconn.visualis.ref.NodeUpdateCSRequestRefImpl; +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; +import com.webank.wedatasphere.dss.standard.app.development.ref.NodeRequestRef; +import 
com.webank.wedatasphere.dss.standard.app.development.ref.CreateRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefCreationOperation; +import com.webank.wedatasphere.dss.standard.app.sso.builder.SSOUrlBuilderOperation; +import com.webank.wedatasphere.dss.standard.app.sso.request.SSORequestOperation; +import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import com.webank.wedatasphere.linkis.cs.common.utils.CSCommonUtils; +import com.webank.wedatasphere.linkis.httpclient.response.HttpResult; +import com.webank.wedatasphere.linkis.server.BDPJettyServerHelper; +import org.apache.commons.lang.StringUtils; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import java.util.List; +import java.util.Map; + +public class VisualisRefCreationOperation implements RefCreationOperation { + private final static Logger logger = LoggerFactory.getLogger(VisualisRefCreationOperation.class); + + DevelopmentService developmentService; + private SSORequestOperation ssoRequestOperation; + + public VisualisRefCreationOperation(DevelopmentService service){ + this.developmentService = service; + this.ssoRequestOperation = this.developmentService.getSSORequestService().createSSORequestOperation(getAppName()); + } + + + private String getAppName() { + return VisualisAppConn.VISUALIS_APPCONN_NAME; + } + + @Override + public ResponseRef createRef(CreateRequestRef requestRef) throws ExternalOperationFailedException { + NodeRequestRef visualisCreateRequestRef = (NodeRequestRef) requestRef; + requestRef.setParameter("projectId", visualisCreateRequestRef.getProjectId()); + if("linkis.appconn.visualis.widget".equalsIgnoreCase(visualisCreateRequestRef.getNodeType())){ + return sendWidgetRequest(visualisCreateRequestRef); + } else if("linkis.appconn.visualis.display".equalsIgnoreCase(visualisCreateRequestRef.getNodeType())){ + 
return sendDisplayRequest(visualisCreateRequestRef); + }else if("linkis.appconn.visualis.dashboard".equalsIgnoreCase(visualisCreateRequestRef.getNodeType())){ + return sendDashboardRequest(visualisCreateRequestRef); + } else { + throw new ExternalOperationFailedException(90177, "Unknown task type " + visualisCreateRequestRef.getNodeType(), null); + } + } + + private ResponseRef sendWidgetRequest(NodeRequestRef requestRef) throws ExternalOperationFailedException { + String url = getBaseUrl() + URLUtils.widgetUrl; + VisualisPostAction visualisPostAction = new VisualisPostAction(); + visualisPostAction.setUser(requestRef.getUserName()); + visualisPostAction.addRequestPayload("widgetName", requestRef.getName()); + visualisPostAction.addRequestPayload("projectId", requestRef.getParameter("projectId")); + visualisPostAction.addRequestPayload(CSCommonUtils.CONTEXT_ID_STR, requestRef.getJobContent().get(CSCommonUtils.CONTEXT_ID_STR)); + if(requestRef.getJobContent().get("bindViewKey") != null){ + String viewNodeName = requestRef.getJobContent().get("bindViewKey").toString(); + if(StringUtils.isNotBlank(viewNodeName) && !"empty".equals(viewNodeName)){ + viewNodeName = getNodeNameByKey(viewNodeName,(String) requestRef.getJobContent().get("json")); + visualisPostAction.addRequestPayload(CSCommonUtils.NODE_NAME_STR, viewNodeName); + } + } + SSOUrlBuilderOperation ssoUrlBuilderOperation = requestRef.getWorkspace().getSSOUrlBuilderOperation().copy(); + ssoUrlBuilderOperation.setAppName(getAppName()); + ssoUrlBuilderOperation.setReqUrl(url); + ssoUrlBuilderOperation.setWorkspace(requestRef.getWorkspace().getWorkspaceName()); + ResponseRef responseRef; + try{ + visualisPostAction.setUrl(ssoUrlBuilderOperation.getBuiltUrl()); + // + HttpResult httpResult = (HttpResult) this.ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, visualisPostAction); + responseRef = new VisualisCommonResponseRef(httpResult.getResponseBody()); + } catch (Exception e){ + throw new 
ExternalOperationFailedException(90177, "Create Widget Exception", e); + } + if(responseRef.isFailed()){ + logger.error(responseRef.getResponseBody()); + throw new ExternalOperationFailedException(90178,responseRef.getErrorMsg()); + } + // cs + VisualisRefUpdateOperation visualisRefUpdateOperation = new VisualisRefUpdateOperation(developmentService); + NodeUpdateCSRequestRefImpl visualisUpdateCSRequestRef = new NodeUpdateCSRequestRefImpl(); + visualisUpdateCSRequestRef.setContextID((String) requestRef.getJobContent().get(CSCommonUtils.CONTEXT_ID_STR)); + visualisUpdateCSRequestRef.setJobContent(responseRef.toMap()); + visualisUpdateCSRequestRef.setUserName(requestRef.getUserName()); + visualisUpdateCSRequestRef.setNodeType(requestRef.getNodeType()); + visualisUpdateCSRequestRef.setWorkspace(requestRef.getWorkspace()); + visualisRefUpdateOperation.updateRef(visualisUpdateCSRequestRef); + return responseRef; + } + + public static String getNodeNameByKey(String key, String json) { + JsonParser parser = new JsonParser(); + JsonObject jsonObject = parser.parse(json).getAsJsonObject(); + JsonArray nodeJsonArray = jsonObject.getAsJsonArray("nodes"); + List<DSSNodeDefault> dwsNodes = DSSCommonUtils.COMMON_GSON.fromJson(nodeJsonArray, new TypeToken<List<DSSNodeDefault>>() { + }.getType()); + return dwsNodes.stream().filter(n -> key.equals(n.getId())).map(DSSNode::getName).findFirst().orElse(""); + } + + + private ResponseRef sendDisplayRequest(NodeRequestRef requestRef) throws ExternalOperationFailedException { + String url = getBaseUrl() + URLUtils.displayUrl; + VisualisPostAction visualisPostAction = new VisualisPostAction(); + visualisPostAction.setUser(requestRef.getUserName()); + visualisPostAction.addRequestPayload("name", requestRef.getName()); + visualisPostAction.addRequestPayload("projectId", requestRef.getParameter("projectId")); + visualisPostAction.addRequestPayload("avatar", "18"); + visualisPostAction.addRequestPayload("publish", true); + SSOUrlBuilderOperation ssoUrlBuilderOperation =
requestRef.getWorkspace().getSSOUrlBuilderOperation().copy(); + ssoUrlBuilderOperation.setAppName(getAppName()); + ssoUrlBuilderOperation.setReqUrl(url); + ssoUrlBuilderOperation.setWorkspace(requestRef.getWorkspace().getWorkspaceName()); + VisualisCommonResponseRef responseRef; + try{ + visualisPostAction.setUrl(ssoUrlBuilderOperation.getBuiltUrl()); + HttpResult httpResult = (HttpResult) this.ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, visualisPostAction); + responseRef = new VisualisCommonResponseRef(httpResult.getResponseBody()); + } catch (Exception e){ + throw new ExternalOperationFailedException(90177, "Create Display Exception", e); + } + createDisplaySlide(responseRef, requestRef); + return responseRef; + } + + private ResponseRef sendDashboardRequest(NodeRequestRef requestRef) throws ExternalOperationFailedException { + String url = getBaseUrl() + URLUtils.dashboardPortalUrl; + VisualisPostAction visualisPostAction = new VisualisPostAction(); + visualisPostAction.setUser(requestRef.getUserName()); + visualisPostAction.addRequestPayload("name", requestRef.getName()); + visualisPostAction.addRequestPayload("projectId", requestRef.getParameter("projectId")); + visualisPostAction.addRequestPayload("avatar", "18"); + visualisPostAction.addRequestPayload("publish", true); + SSOUrlBuilderOperation ssoUrlBuilderOperation = requestRef.getWorkspace().getSSOUrlBuilderOperation().copy(); + ssoUrlBuilderOperation.setAppName(getAppName()); + ssoUrlBuilderOperation.setReqUrl(url); + ssoUrlBuilderOperation.setWorkspace(requestRef.getWorkspace().getWorkspaceName()); + VisualisCommonResponseRef responseRef; + try{ + visualisPostAction.setUrl(ssoUrlBuilderOperation.getBuiltUrl()); + HttpResult httpResult = (HttpResult) this.ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, visualisPostAction); + responseRef = new VisualisCommonResponseRef(httpResult.getResponseBody()); + } catch (Exception e){ + throw new ExternalOperationFailedException(90177, 
"Create Dashboard Exception", e); + } + createDashboard(responseRef, requestRef); + return responseRef; + } + + private void createDisplaySlide(VisualisCommonResponseRef displayCreateResponseRef, NodeRequestRef requestRef) throws ExternalOperationFailedException { + String url = getBaseUrl() + URLUtils.displayUrl + "/" + displayCreateResponseRef.getDisplayId() + "/slides"; + VisualisPostAction visualisPostAction = new VisualisPostAction(); + visualisPostAction.setUser(requestRef.getUserName()); + visualisPostAction.addRequestPayload("config", URLUtils.displaySlideConfig); + visualisPostAction.addRequestPayload("displayId", Long.parseLong(displayCreateResponseRef.getDisplayId())); + visualisPostAction.addRequestPayload("index", 0); + SSOUrlBuilderOperation ssoUrlBuilderOperation = requestRef.getWorkspace().getSSOUrlBuilderOperation().copy(); + ssoUrlBuilderOperation.setAppName(getAppName()); + ssoUrlBuilderOperation.setReqUrl(url); + ssoUrlBuilderOperation.setWorkspace(requestRef.getWorkspace().getWorkspaceName()); + Map resMap = Maps.newHashMap(); + try{ + visualisPostAction.setUrl(ssoUrlBuilderOperation.getBuiltUrl()); + HttpResult httpResult = (HttpResult) this.ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, visualisPostAction); + resMap = BDPJettyServerHelper.jacksonJson().readValue(httpResult.getResponseBody(), Map.class); + } catch (Exception e){ + throw new ExternalOperationFailedException(90177, "Create DisplaySlide Exception", e); + } + Map header = (Map) resMap.get("header"); + int code = (int) header.get("code"); + if (code != 200) { + String errorMsg = header.toString(); + throw new ExternalOperationFailedException(90176, errorMsg, null); + } + } + + private void createDashboard(VisualisCommonResponseRef dashboardCreateResponseRef, NodeRequestRef requestRef) throws ExternalOperationFailedException { + String url = getBaseUrl() + URLUtils.dashboardPortalUrl + "/" + dashboardCreateResponseRef.getDashboardId() + "/dashboards"; + 
VisualisPostAction visualisPostAction = new VisualisPostAction(); + visualisPostAction.setUser(requestRef.getUserName()); + visualisPostAction.addRequestPayload("config", ""); + visualisPostAction.addRequestPayload("dashboardPortalId", Long.parseLong(dashboardCreateResponseRef.getDashboardId())); + visualisPostAction.addRequestPayload("index", 0); + visualisPostAction.addRequestPayload("name", requestRef.getName()); + visualisPostAction.addRequestPayload("parentId", 0); + visualisPostAction.addRequestPayload("type", 1); + SSOUrlBuilderOperation ssoUrlBuilderOperation = requestRef.getWorkspace().getSSOUrlBuilderOperation().copy(); + ssoUrlBuilderOperation.setAppName(getAppName()); + ssoUrlBuilderOperation.setReqUrl(url); + ssoUrlBuilderOperation.setWorkspace(requestRef.getWorkspace().getWorkspaceName()); + Map resMap = Maps.newHashMap(); + try{ + visualisPostAction.setUrl(ssoUrlBuilderOperation.getBuiltUrl()); + HttpResult httpResult = (HttpResult) this.ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, visualisPostAction); + resMap = BDPJettyServerHelper.jacksonJson().readValue(httpResult.getResponseBody(), Map.class); + } catch (Exception e){ + throw new ExternalOperationFailedException(90177, "Create Dashboard Exception", e); + } + Map header = (Map) resMap.get("header"); + int code = (int) header.get("code"); + if (code != 200) { + String errorMsg = header.toString(); + throw new ExternalOperationFailedException(90176, errorMsg, null); + } + } + + @Override + public void setDevelopmentService(DevelopmentService service) { + this.developmentService = service; + } + + private String getBaseUrl(){ + return developmentService.getAppInstance().getBaseUrl(); + } + + +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefDeletionOperation.java 
b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefDeletionOperation.java new file mode 100644 index 000000000..f8789df36 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefDeletionOperation.java @@ -0,0 +1,168 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.operation; + +import com.google.common.collect.Maps; +import com.webank.wedatasphere.dss.appconn.visualis.VisualisAppConn; +import com.webank.wedatasphere.dss.appconn.visualis.model.VisualisDeleteAction; +import com.webank.wedatasphere.dss.appconn.visualis.utils.URLUtils; +import com.webank.wedatasphere.dss.appconn.visualis.utils.VisualisNodeUtils; +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; +import com.webank.wedatasphere.dss.standard.app.development.ref.NodeRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefDeletionOperation; +import com.webank.wedatasphere.dss.standard.app.sso.builder.SSOUrlBuilderOperation; +import com.webank.wedatasphere.dss.standard.app.sso.request.SSORequestOperation; +import com.webank.wedatasphere.dss.standard.common.entity.ref.RequestRef; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; 
+import com.webank.wedatasphere.linkis.httpclient.response.HttpResult; +import com.webank.wedatasphere.linkis.server.BDPJettyServerHelper; + +import java.util.Map; + +public class VisualisRefDeletionOperation implements RefDeletionOperation { + + private DevelopmentService developmentService; + private SSORequestOperation ssoRequestOperation; + + public VisualisRefDeletionOperation(DevelopmentService service) { + this.developmentService = service; + this.ssoRequestOperation = this.developmentService.getSSORequestService().createSSORequestOperation(getAppName()); + } + + private String getAppName() { + return VisualisAppConn.VISUALIS_APPCONN_NAME; + } + + @Override + public void deleteRef(RequestRef requestRef) throws ExternalOperationFailedException { + NodeRequestRef visualisDeleteRequestRef = (NodeRequestRef) requestRef; + if ("linkis.appconn.visualis.widget".equalsIgnoreCase(visualisDeleteRequestRef.getNodeType())) { + deleteWidget(visualisDeleteRequestRef); + } else if ("linkis.appconn.visualis.display".equalsIgnoreCase(visualisDeleteRequestRef.getNodeType())) { + deleteDisplay(visualisDeleteRequestRef); + } else if ("linkis.appconn.visualis.dashboard".equalsIgnoreCase(visualisDeleteRequestRef.getNodeType())) { + deleteDashboardPortal(visualisDeleteRequestRef); + } else { + throw new ExternalOperationFailedException(90177, "Unknown task type " + visualisDeleteRequestRef.getNodeType(), null); + } + } + + private void deleteWidget(NodeRequestRef visualisDeleteRequestRef) throws ExternalOperationFailedException { + String url = null; + try { + url = getBaseUrl() + URLUtils.widgetDeleteUrl + "/" + VisualisNodeUtils.getId(visualisDeleteRequestRef); + } catch (Exception e) { + throw new ExternalOperationFailedException(90177, "Delete Widget Exception", e); + } + VisualisDeleteAction deleteAction = new VisualisDeleteAction(); + deleteAction.setUser(visualisDeleteRequestRef.getUserName()); + SSOUrlBuilderOperation ssoUrlBuilderOperation = 
visualisDeleteRequestRef.getWorkspace().getSSOUrlBuilderOperation().copy(); + ssoUrlBuilderOperation.setAppName(getAppName()); + ssoUrlBuilderOperation.setReqUrl(url); + ssoUrlBuilderOperation.setWorkspace(visualisDeleteRequestRef.getWorkspace().getWorkspaceName()); + String response = ""; + Map resMap = Maps.newHashMap(); + HttpResult httpResult = null; + try { + deleteAction.setUrl(ssoUrlBuilderOperation.getBuiltUrl()); + httpResult = (HttpResult) this.ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, deleteAction); + response = httpResult.getResponseBody(); + resMap = BDPJettyServerHelper.jacksonJson().readValue(response, Map.class); + } catch (Exception e) { + throw new ExternalOperationFailedException(90177, "Delete Widget Exception", e); + } + Map header = (Map) resMap.get("header"); + int code = (int) header.get("code"); + if (code != 200) { + String errorMsg = header.toString(); + throw new ExternalOperationFailedException(90177, errorMsg, null); + } + } + + private void deleteDisplay(NodeRequestRef visualisDeleteRequestRef) throws ExternalOperationFailedException { + String url = null; + try { + url = getBaseUrl() + URLUtils.displayUrl + "/" + VisualisNodeUtils.getId(visualisDeleteRequestRef); + } catch (Exception e) { + throw new ExternalOperationFailedException(90177, "Delete Display Exception", e); + } + VisualisDeleteAction deleteAction = new VisualisDeleteAction(); + deleteAction.setUser(visualisDeleteRequestRef.getUserName()); + SSOUrlBuilderOperation ssoUrlBuilderOperation = visualisDeleteRequestRef.getWorkspace().getSSOUrlBuilderOperation().copy(); + ssoUrlBuilderOperation.setAppName(getAppName()); + ssoUrlBuilderOperation.setReqUrl(url); + ssoUrlBuilderOperation.setWorkspace(visualisDeleteRequestRef.getWorkspace().getWorkspaceName()); + String response = ""; + Map resMap = Maps.newHashMap(); + HttpResult httpResult = null; + try { + deleteAction.setUrl(ssoUrlBuilderOperation.getBuiltUrl()); + httpResult = (HttpResult)
this.ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, deleteAction); + response = httpResult.getResponseBody(); + resMap = BDPJettyServerHelper.jacksonJson().readValue(response, Map.class); + } catch (Exception e) { + throw new ExternalOperationFailedException(90177, "Delete Display Exception", e); + } + Map header = (Map) resMap.get("header"); + int code = (int) header.get("code"); + if (code != 200) { + String errorMsg = header.toString(); + throw new ExternalOperationFailedException(90177, errorMsg, null); + } + } + + private void deleteDashboardPortal(NodeRequestRef visualisDeleteRequestRef) throws ExternalOperationFailedException { + String url = null; + try { + url = getBaseUrl() + URLUtils.dashboardPortalUrl + "/" + VisualisNodeUtils.getId(visualisDeleteRequestRef); + } catch (Exception e) { + throw new ExternalOperationFailedException(90177, "Delete Dashboard Exception", e); + } + VisualisDeleteAction deleteAction = new VisualisDeleteAction(); + deleteAction.setUser(visualisDeleteRequestRef.getUserName()); + SSOUrlBuilderOperation ssoUrlBuilderOperation = visualisDeleteRequestRef.getWorkspace().getSSOUrlBuilderOperation().copy(); + ssoUrlBuilderOperation.setAppName(getAppName()); + ssoUrlBuilderOperation.setReqUrl(url); + ssoUrlBuilderOperation.setWorkspace(visualisDeleteRequestRef.getWorkspace().getWorkspaceName()); + String response = ""; + Map resMap = Maps.newHashMap(); + HttpResult httpResult = null; + try { + deleteAction.setUrl(ssoUrlBuilderOperation.getBuiltUrl()); + httpResult = (HttpResult) this.ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, deleteAction); + response = httpResult.getResponseBody(); + resMap = BDPJettyServerHelper.jacksonJson().readValue(response, Map.class); + } catch (Exception e) { + throw new ExternalOperationFailedException(90177, "Delete Dashboard Exception", e); + } + Map header = (Map) resMap.get("header"); + int code = (int) header.get("code"); + if (code != 200) { + String errorMsg = header.toString(); +
throw new ExternalOperationFailedException(90177, errorMsg, null); + } + } + + @Override + public void setDevelopmentService(DevelopmentService service) { + this.developmentService = service; + } + + private String getBaseUrl() { + return developmentService.getAppInstance().getBaseUrl(); + } +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefExportOperation.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefExportOperation.java new file mode 100644 index 000000000..5ded49d2f --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefExportOperation.java @@ -0,0 +1,103 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.operation; + +import com.webank.wedatasphere.dss.appconn.visualis.VisualisAppConn; +import com.webank.wedatasphere.dss.appconn.visualis.model.VisualisPostAction; +import com.webank.wedatasphere.dss.appconn.visualis.publish.VisualisExportResponseRef; +import com.webank.wedatasphere.dss.appconn.visualis.ref.VisualisCommonResponseRef; +import com.webank.wedatasphere.dss.appconn.visualis.utils.URLUtils; +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; +import com.webank.wedatasphere.dss.standard.app.development.ref.ExportRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefExportOperation; +import com.webank.wedatasphere.dss.standard.app.sso.builder.SSOUrlBuilderOperation; +import com.webank.wedatasphere.dss.standard.app.sso.request.SSORequestOperation; +import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import com.webank.wedatasphere.linkis.httpclient.request.HttpAction; +import com.webank.wedatasphere.linkis.httpclient.response.HttpResult; +import com.webank.wedatasphere.linkis.server.BDPJettyServerHelper; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class VisualisRefExportOperation implements RefExportOperation { + + private final static Logger logger = LoggerFactory.getLogger(VisualisRefExportOperation.class); + + DevelopmentService developmentService; + private SSORequestOperation ssoRequestOperation; + + public VisualisRefExportOperation(DevelopmentService developmentService){ + this.developmentService = developmentService; + this.ssoRequestOperation = this.developmentService.getSSORequestService().createSSORequestOperation(getAppName()); + } + + private String getAppName() { + return VisualisAppConn.VISUALIS_APPCONN_NAME; + } + + @Override + public ResponseRef 
exportRef(ExportRequestRef requestRef) throws ExternalOperationFailedException { + String url = getBaseUrl() + URLUtils.projectUrl + "/export"; + VisualisPostAction visualisPostAction = new VisualisPostAction(); + visualisPostAction.setUser(requestRef.getParameter("user").toString()); + visualisPostAction.addRequestPayload("projectId", requestRef.getParameter("projectId")); + visualisPostAction.addRequestPayload("partial", true); + String nodeType = requestRef.getParameter("nodeType").toString(); + String externalContent = null; + try { + externalContent = BDPJettyServerHelper.jacksonJson().writeValueAsString(requestRef.getParameter("jobContent")); + if("linkis.appconn.visualis.widget".equalsIgnoreCase(nodeType)){ + VisualisCommonResponseRef widgetCreateResponseRef = new VisualisCommonResponseRef(externalContent); + visualisPostAction.addRequestPayload("widgetIds", ((Double) Double.parseDouble(widgetCreateResponseRef.getWidgetId())).longValue()); + } else if("linkis.appconn.visualis.display".equalsIgnoreCase(nodeType)){ + VisualisCommonResponseRef displayCreateResponseRef = new VisualisCommonResponseRef(externalContent); + visualisPostAction.addRequestPayload("displayIds", ((Double) Double.parseDouble(displayCreateResponseRef.getDisplayId())).longValue()); + } else if("linkis.appconn.visualis.dashboard".equalsIgnoreCase(nodeType)){ + VisualisCommonResponseRef dashboardCreateResponseRef = new VisualisCommonResponseRef(externalContent); + visualisPostAction.addRequestPayload("dashboardPortalIds", ((Double) Double.parseDouble(dashboardCreateResponseRef.getDashboardId())).longValue()); + } else { + throw new ExternalOperationFailedException(90177, "Unknown task type " + requestRef.getType(), null); + } + } catch (Exception e) { + logger.error("Failed to create export request", e); + } + SSOUrlBuilderOperation ssoUrlBuilderOperation = requestRef.getWorkspace().getSSOUrlBuilderOperation().copy(); + ssoUrlBuilderOperation.setAppName(getAppName()); + 
ssoUrlBuilderOperation.setReqUrl(url); + ssoUrlBuilderOperation.setWorkspace(requestRef.getWorkspace().getWorkspaceName()); + ResponseRef responseRef; + try{ + visualisPostAction.setUrl(ssoUrlBuilderOperation.getBuiltUrl()); + HttpResult httpResult = this.ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, visualisPostAction); + responseRef = new VisualisExportResponseRef(httpResult.getResponseBody()); + } catch (Exception e){ + throw new ExternalOperationFailedException(90176, "Export Visualis Exception", e); + } + return responseRef; + } + + @Override + public void setDevelopmentService(DevelopmentService service) { + developmentService = service; + } + + private String getBaseUrl(){ + return developmentService.getAppInstance().getBaseUrl(); + } + +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefImportOperation.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefImportOperation.java new file mode 100644 index 000000000..2901bd375 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefImportOperation.java @@ -0,0 +1,87 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.operation; + +import com.webank.wedatasphere.dss.appconn.visualis.VisualisAppConn; +import com.webank.wedatasphere.dss.appconn.visualis.model.VisualisPostAction; +import com.webank.wedatasphere.dss.appconn.visualis.publish.VisualisImportResponseRef; +import com.webank.wedatasphere.dss.appconn.visualis.utils.URLUtils; +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; +import com.webank.wedatasphere.dss.standard.app.development.ref.ImportRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefImportOperation; +import com.webank.wedatasphere.dss.standard.app.sso.builder.SSOUrlBuilderOperation; +import com.webank.wedatasphere.dss.standard.app.sso.request.SSORequestOperation; +import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import com.webank.wedatasphere.linkis.httpclient.request.HttpAction; +import com.webank.wedatasphere.linkis.httpclient.response.HttpResult; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import java.util.Map; + +public class VisualisRefImportOperation implements RefImportOperation { + + private final static Logger logger = LoggerFactory.getLogger(VisualisRefImportOperation.class); + + DevelopmentService developmentService; + private SSORequestOperation ssoRequestOperation; + + public VisualisRefImportOperation(DevelopmentService developmentService){ + this.developmentService = developmentService; + this.ssoRequestOperation = this.developmentService.getSSORequestService().createSSORequestOperation(getAppName()); + } + + private String getAppName() { + return VisualisAppConn.VISUALIS_APPCONN_NAME; + } + + @Override + public ResponseRef importRef(ImportRequestRef requestRef) throws ExternalOperationFailedException { + String url = getBaseUrl() + URLUtils.projectUrl + 
"/import"; + VisualisPostAction visualisPostAction = new VisualisPostAction(); + visualisPostAction.setUser(requestRef.getParameter("user").toString()); + visualisPostAction.addRequestPayload("projectId", requestRef.getParameter("projectId")); + visualisPostAction.addRequestPayload("projectVersion", "v1"); + visualisPostAction.addRequestPayload("flowVersion", requestRef.getParameter("orcVersion")); + visualisPostAction.addRequestPayload("resourceId", requestRef.getParameter("resourceId")); + visualisPostAction.addRequestPayload("version", requestRef.getParameter("version")); + SSOUrlBuilderOperation ssoUrlBuilderOperation = requestRef.getWorkspace().getSSOUrlBuilderOperation().copy(); + ssoUrlBuilderOperation.setAppName(getAppName()); + ssoUrlBuilderOperation.setReqUrl(url); + ssoUrlBuilderOperation.setWorkspace(requestRef.getWorkspace().getWorkspaceName()); + ResponseRef responseRef; + try{ + visualisPostAction.setUrl(ssoUrlBuilderOperation.getBuiltUrl()); + HttpResult httpResult = this.ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, visualisPostAction); + responseRef = new VisualisImportResponseRef((Map) requestRef.getParameter("jobContent"), httpResult.getResponseBody(), requestRef.getParameter("nodeType").toString(), requestRef.getParameter("projectId")); + } catch (Exception e){ + throw new ExternalOperationFailedException(90176, "Export Visualis Exception", e); + } + return responseRef; + } + + @Override + public void setDevelopmentService(DevelopmentService service) { + this.developmentService = service; + } + + private String getBaseUrl(){ + return developmentService.getAppInstance().getBaseUrl(); + } + +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefQueryOperation.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefQueryOperation.java new file mode 100644 index 000000000..3e028d9d9 
--- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefQueryOperation.java @@ -0,0 +1,77 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.operation; + +import com.webank.wedatasphere.dss.appconn.visualis.ref.VisualisCommonResponseRef; +import com.webank.wedatasphere.dss.appconn.visualis.ref.VisualisOpenRequestRef; +import com.webank.wedatasphere.dss.appconn.visualis.ref.VisualisOpenResponseRef; +import com.webank.wedatasphere.dss.appconn.visualis.utils.URLUtils; +import com.webank.wedatasphere.dss.common.utils.DSSCommonUtils; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefQueryOperation; +import com.webank.wedatasphere.dss.standard.app.development.ref.OpenRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; +import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import com.webank.wedatasphere.linkis.server.BDPJettyServerHelper; + +import java.util.HashMap; +import java.util.Map; + +public class VisualisRefQueryOperation implements RefQueryOperation { + + DevelopmentService developmentService; + + @Override + public ResponseRef query(OpenRequestRef ref) throws 
ExternalOperationFailedException { + VisualisOpenRequestRef visualisOpenRequestRef = (VisualisOpenRequestRef) ref; + try { + String externalContent = BDPJettyServerHelper.jacksonJson().writeValueAsString(visualisOpenRequestRef.getJobContent()); + Long projectId = (Long) visualisOpenRequestRef.getParameter("projectId"); + String baseUrl = visualisOpenRequestRef.getParameter("redirectUrl").toString(); + String jumpUrl = baseUrl; + if("linkis.appconn.visualis.widget".equalsIgnoreCase(visualisOpenRequestRef.getType())){ + VisualisCommonResponseRef widgetCreateResponseRef = new VisualisCommonResponseRef(externalContent); + jumpUrl = URLUtils.getUrl(baseUrl, URLUtils.WIDGET_JUMP_URL_FORMAT, projectId.toString(), widgetCreateResponseRef.getWidgetId()); + } else if("linkis.appconn.visualis.display".equalsIgnoreCase(visualisOpenRequestRef.getType())){ + VisualisCommonResponseRef displayCreateResponseRef = new VisualisCommonResponseRef(externalContent); + jumpUrl = URLUtils.getUrl(baseUrl, URLUtils.DISPLAY_JUMP_URL_FORMAT, projectId.toString(), displayCreateResponseRef.getDisplayId()); + } else if("linkis.appconn.visualis.dashboard".equalsIgnoreCase(visualisOpenRequestRef.getType())){ + VisualisCommonResponseRef dashboardCreateResponseRef = new VisualisCommonResponseRef(externalContent); + jumpUrl = URLUtils.getUrl(baseUrl, URLUtils.DASHBOARD_JUMP_URL_FORMAT, projectId.toString(), dashboardCreateResponseRef.getDashboardId(), visualisOpenRequestRef.getName()); + } else { + throw new ExternalOperationFailedException(90177, "Unknown task type " + visualisOpenRequestRef.getType(), null); + } + String retJumpUrl = getEnvUrl(jumpUrl, visualisOpenRequestRef); + Map retMap = new HashMap<>(); + retMap.put("jumpUrl", retJumpUrl); + return new VisualisOpenResponseRef(DSSCommonUtils.COMMON_GSON.toJson(retMap), 0); + } catch (Exception e) { + throw new ExternalOperationFailedException(90177, "Failed to parse jobContent", e); + } + } + + public String getEnvUrl(String url, 
VisualisOpenRequestRef visualisOpenRequestRef ){ + String env = ((Map) visualisOpenRequestRef.getParameter("params")).get(DSSCommonUtils.DSS_LABELS_KEY).toString(); + return url + "?env=" + env.toLowerCase(); + } + + + @Override + public void setDevelopmentService(DevelopmentService service) { + this.developmentService = service; + } +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefUpdateOperation.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefUpdateOperation.java new file mode 100644 index 000000000..113464bd8 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/operation/VisualisRefUpdateOperation.java @@ -0,0 +1,234 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.operation; + +import com.google.common.collect.Lists; +import com.google.common.collect.Maps; +import com.webank.wedatasphere.dss.appconn.visualis.VisualisAppConn; +import com.webank.wedatasphere.dss.appconn.visualis.model.VisualisPostAction; +import com.webank.wedatasphere.dss.appconn.visualis.model.VisualisPutAction; +import com.webank.wedatasphere.dss.appconn.visualis.utils.URLUtils; +import com.webank.wedatasphere.dss.appconn.visualis.utils.VisualisNodeUtils; +import com.webank.wedatasphere.dss.standard.app.development.ref.UpdateCSRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; +import com.webank.wedatasphere.dss.standard.app.development.ref.NodeRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefUpdateOperation; +import com.webank.wedatasphere.dss.standard.app.development.ref.UpdateRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.ref.CommonResponseRef; +import com.webank.wedatasphere.dss.standard.app.sso.builder.SSOUrlBuilderOperation; +import com.webank.wedatasphere.dss.standard.app.sso.request.SSORequestOperation; +import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import com.webank.wedatasphere.linkis.cs.common.utils.CSCommonUtils; +import com.webank.wedatasphere.linkis.httpclient.request.HttpAction; +import com.webank.wedatasphere.linkis.httpclient.response.HttpResult; +import com.webank.wedatasphere.linkis.server.BDPJettyServerHelper; + +import java.util.Map; + +public class VisualisRefUpdateOperation implements RefUpdateOperation { + + DevelopmentService developmentService; + private SSORequestOperation ssoRequestOperation; + + public VisualisRefUpdateOperation(DevelopmentService developmentService) { + this.developmentService = developmentService; + + 
this.ssoRequestOperation = developmentService.getSSORequestService().createSSORequestOperation(getAppName()); + } + + private String getAppName() { + return VisualisAppConn.VISUALIS_APPCONN_NAME; + } + + @Override + public ResponseRef updateRef(UpdateRequestRef requestRef) throws ExternalOperationFailedException { + if (!(requestRef instanceof UpdateCSRequestRef)) { + NodeRequestRef visualisUpdateRequestRef = (NodeRequestRef) requestRef; + if ("linkis.appconn.visualis.widget".equalsIgnoreCase(visualisUpdateRequestRef.getNodeType())) { + return updateWidget(visualisUpdateRequestRef); + } else if ("linkis.appconn.visualis.display".equalsIgnoreCase(visualisUpdateRequestRef.getNodeType())) { + return updateDisplay(visualisUpdateRequestRef); + } else if ("linkis.appconn.visualis.dashboard".equalsIgnoreCase(visualisUpdateRequestRef.getNodeType())) { + return updateDashboardPortal(visualisUpdateRequestRef); + } else { + throw new ExternalOperationFailedException(90177, "Unknown task type " + visualisUpdateRequestRef.getNodeType(), null); + } + } else { + NodeRequestRef visualisUpdateCSRequestRef = (NodeRequestRef) requestRef; + if ("linkis.appconn.visualis.widget".equalsIgnoreCase(visualisUpdateCSRequestRef.getNodeType())) { + return updateWidgetCS(visualisUpdateCSRequestRef); + } else { + throw new ExternalOperationFailedException(90177, "Unknown task type " + visualisUpdateCSRequestRef.getNodeType(), null); + } + } + + } + + private ResponseRef updateWidgetCS(NodeRequestRef visualisUpdateCSRequestRef) throws ExternalOperationFailedException { + String url = getBaseUrl() + URLUtils.widgetContextUrl; + VisualisPostAction postAction = new VisualisPostAction(); + try { + postAction.addRequestPayload("id", Integer.parseInt(VisualisNodeUtils.getId(visualisUpdateCSRequestRef))); + postAction.addRequestPayload(CSCommonUtils.CONTEXT_ID_STR, visualisUpdateCSRequestRef.getContextID()); + postAction.setUser(visualisUpdateCSRequestRef.getUserName()); + SSOUrlBuilderOperation 
ssoUrlBuilderOperation = visualisUpdateCSRequestRef.getWorkspace().getSSOUrlBuilderOperation().copy(); + ssoUrlBuilderOperation.setAppName(getAppName()); + ssoUrlBuilderOperation.setReqUrl(url); + ssoUrlBuilderOperation.setWorkspace(visualisUpdateCSRequestRef.getWorkspace().getWorkspaceName()); + postAction.setUrl(ssoUrlBuilderOperation.getBuiltUrl()); + HttpResult httpResult = this.ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, postAction); + String response = httpResult.getResponseBody(); + Map resMap = BDPJettyServerHelper.jacksonJson().readValue(response, Map.class); + int status = (int) resMap.get("status"); + if (status != 0) { + String errorMsg = resMap.get("message").toString(); + throw new ExternalOperationFailedException(90177, errorMsg); + } + return new CommonResponseRef(); + } catch (Exception e) { + throw new ExternalOperationFailedException(90177, "Update CS Exception", e); + } + } + + private ResponseRef updateDashboardPortal(NodeRequestRef visualisUpdateRequestRef) throws ExternalOperationFailedException { + String url = null; + String id = null; + try { + id = VisualisNodeUtils.getId(visualisUpdateRequestRef); + url = getBaseUrl() + URLUtils.dashboardPortalUrl + "/" + id; + } catch (Exception e) { + throw new ExternalOperationFailedException(90177, "Update Dashboard Exception", e); + } + VisualisPutAction putAction = new VisualisPutAction(); + putAction.addRequestPayload("projectId", visualisUpdateRequestRef.getProjectId()); + putAction.addRequestPayload("name", visualisUpdateRequestRef.getName()); + putAction.addRequestPayload("id", Long.parseLong(id)); + putAction.addRequestPayload("avatar", "9"); + putAction.addRequestPayload("description", ""); + putAction.addRequestPayload("publish", true); + putAction.addRequestPayload("roleIds", Lists.newArrayList()); + putAction.setUser(visualisUpdateRequestRef.getUserName()); + SSOUrlBuilderOperation ssoUrlBuilderOperation = visualisUpdateRequestRef.getWorkspace().getSSOUrlBuilderOperation().copy(); 
+ ssoUrlBuilderOperation.setAppName(getAppName()); + ssoUrlBuilderOperation.setReqUrl(url); + ssoUrlBuilderOperation.setWorkspace(visualisUpdateRequestRef.getWorkspace().getWorkspaceName()); + String response = ""; + Map resMap = Maps.newHashMap(); + HttpResult httpResult = null; + try { + putAction.setUrl(ssoUrlBuilderOperation.getBuiltUrl()); + httpResult = this.ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, putAction); + response = httpResult.getResponseBody(); + resMap = BDPJettyServerHelper.jacksonJson().readValue(response, Map.class); + } catch (Exception e) { + throw new ExternalOperationFailedException(90177, "Update Dashboard Exception", e); + } + Map header = (Map) resMap.get("header"); + int code = (int) header.get("code"); + if (code != 200) { + String errorMsg = header.toString(); + throw new ExternalOperationFailedException(90177, errorMsg, null); + } + return new CommonResponseRef(); + } + + private ResponseRef updateDisplay(NodeRequestRef visualisUpdateRequestRef) throws ExternalOperationFailedException { + String url = null; + String id = null; + try { + id = VisualisNodeUtils.getId(visualisUpdateRequestRef); + url = getBaseUrl() + URLUtils.displayUrl + "/" + id; + } catch (Exception e) { + throw new ExternalOperationFailedException(90177, "Update Display Exception", e); + } + + VisualisPutAction putAction = new VisualisPutAction(); + putAction.addRequestPayload("projectId", visualisUpdateRequestRef.getProjectId()); + putAction.addRequestPayload("name", visualisUpdateRequestRef.getName()); + putAction.addRequestPayload("id", Long.parseLong(id)); + putAction.addRequestPayload("avatar", "9"); + putAction.addRequestPayload("description", ""); + putAction.addRequestPayload("publish", true); + putAction.addRequestPayload("roleIds", Lists.newArrayList()); + putAction.setUser(visualisUpdateRequestRef.getUserName()); + SSOUrlBuilderOperation ssoUrlBuilderOperation = visualisUpdateRequestRef.getWorkspace().getSSOUrlBuilderOperation().copy(); + 
ssoUrlBuilderOperation.setAppName(getAppName()); + ssoUrlBuilderOperation.setReqUrl(url); + ssoUrlBuilderOperation.setWorkspace(visualisUpdateRequestRef.getWorkspace().getWorkspaceName()); + String response = ""; + Map resMap = Maps.newHashMap(); + HttpResult httpResult = null; + try { + putAction.setUrl(ssoUrlBuilderOperation.getBuiltUrl()); + httpResult = this.ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, putAction); + response = httpResult.getResponseBody(); + resMap = BDPJettyServerHelper.jacksonJson().readValue(response, Map.class); + } catch (Exception e) { + throw new ExternalOperationFailedException(90177, "Update Display Exception", e); + } + Map header = (Map) resMap.get("header"); + int code = (int) header.get("code"); + if (code != 200) { + String errorMsg = header.toString(); + throw new ExternalOperationFailedException(90177, errorMsg, null); + } + return new CommonResponseRef(); + } + + private ResponseRef updateWidget(NodeRequestRef visualisUpdateRequestRef) throws ExternalOperationFailedException { + String url = getBaseUrl() + URLUtils.widgetUpdateUrl; + VisualisPostAction postAction = new VisualisPostAction(); + try { + postAction.addRequestPayload("id", Long.parseLong(VisualisNodeUtils.getId(visualisUpdateRequestRef))); + } catch (Exception e) { + throw new ExternalOperationFailedException(90177, "Update Widget Exception", e); + } + postAction.addRequestPayload("name", visualisUpdateRequestRef.getName()); + postAction.setUser(visualisUpdateRequestRef.getUserName()); + SSOUrlBuilderOperation ssoUrlBuilderOperation = visualisUpdateRequestRef.getWorkspace().getSSOUrlBuilderOperation().copy(); + ssoUrlBuilderOperation.setAppName(getAppName()); + ssoUrlBuilderOperation.setReqUrl(url); + ssoUrlBuilderOperation.setWorkspace(visualisUpdateRequestRef.getWorkspace().getWorkspaceName()); + String response = ""; + Map resMap = Maps.newHashMap(); + HttpResult httpResult = null; + try { + postAction.setUrl(ssoUrlBuilderOperation.getBuiltUrl()); + 
httpResult = this.ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, postAction); + response = httpResult.getResponseBody(); + resMap = BDPJettyServerHelper.jacksonJson().readValue(response, Map.class); + } catch (Exception e) { + throw new ExternalOperationFailedException(90177, "Update Widget Exception", e); + } + int status = (int) resMap.get("status"); + if (status != 0) { + String errorMsg = resMap.get("message").toString(); + throw new ExternalOperationFailedException(90177, errorMsg); + } + return new CommonResponseRef(); + } + + @Override + public void setDevelopmentService(DevelopmentService service) { + this.developmentService = service; + } + + private String getBaseUrl() { + return developmentService.getAppInstance().getBaseUrl(); + } +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectCreationOperation.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectCreationOperation.java new file mode 100644 index 000000000..1b7166ffe --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectCreationOperation.java @@ -0,0 +1,109 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.project; + +import com.google.common.collect.Maps; +import com.webank.wedatasphere.dss.appconn.visualis.VisualisAppConn; +import com.webank.wedatasphere.dss.appconn.visualis.model.VisualisPostAction; +import com.webank.wedatasphere.dss.standard.app.sso.builder.SSOUrlBuilderOperation; +import com.webank.wedatasphere.dss.standard.app.sso.request.SSORequestOperation; +import com.webank.wedatasphere.dss.standard.app.structure.StructureService; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectCreationOperation; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectRequestRef; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectResponseRef; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import com.webank.wedatasphere.linkis.httpclient.request.HttpAction; +import com.webank.wedatasphere.linkis.httpclient.response.HttpResult; +import com.webank.wedatasphere.linkis.server.BDPJettyServerHelper; +import com.webank.wedatasphere.linkis.server.conf.ServerConfiguration; +import java.util.Map; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class VisualisProjectCreationOperation implements ProjectCreationOperation { + + private static Logger logger = LoggerFactory.getLogger(VisualisProjectCreationOperation.class); + private final static String projectUrl = "/api/rest_s/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/projects"; + private SSORequestOperation ssoRequestOperation; + private StructureService structureService; + + public VisualisProjectCreationOperation(StructureService service, SSORequestOperation ssoRequestOperation) { + this.structureService = service; + this.ssoRequestOperation = ssoRequestOperation; + } + private String getAppName() { + return VisualisAppConn.VISUALIS_APPCONN_NAME; + } + + @Override + public void init() { + } + + @Override + 
public ProjectResponseRef createProject(ProjectRequestRef projectRef) throws ExternalOperationFailedException { + String url = getBaseUrl() + projectUrl; + VisualisPostAction visualisPostAction = new VisualisPostAction(); + visualisPostAction.setUser(projectRef.getCreateBy()); + visualisPostAction.addRequestPayload("name", projectRef.getName()); + visualisPostAction.addRequestPayload("description", projectRef.getDescription()); + visualisPostAction.addRequestPayload("pic", "6"); + visualisPostAction.addRequestPayload("visibility", true); + SSOUrlBuilderOperation ssoUrlBuilderOperation = projectRef.getWorkspace().getSSOUrlBuilderOperation().copy(); + ssoUrlBuilderOperation.setAppName(getAppName()); + ssoUrlBuilderOperation.setReqUrl(url); + ssoUrlBuilderOperation.setWorkspace(projectRef.getWorkspace().getWorkspaceName()); + String response = ""; + Map resMap = Maps.newHashMap(); + HttpResult httpResult = null; + try { + visualisPostAction.setUrl(ssoUrlBuilderOperation.getBuiltUrl()); + httpResult = this.ssoRequestOperation.requestWithSSO(ssoUrlBuilderOperation, visualisPostAction); + response = httpResult.getResponseBody(); + resMap = BDPJettyServerHelper.jacksonJson().readValue(response, Map.class); + } catch (Exception e) { + logger.error("Create Visualis Project Exception", e); + throw new ExternalOperationFailedException(90176, "Create Visualis Project Exception", e); + } + Map header = (Map) resMap.get("header"); + int code = (int) header.get("code"); + String errorMsg = ""; + if (code != 200) { + errorMsg = header.toString(); + throw new ExternalOperationFailedException(90176, errorMsg, null); + } + Integer projectId = (Integer) ((Map) resMap.get("payload")).get("id"); + VisualisProjectResponseRef visualisProjectResponseRef = null; + try { + visualisProjectResponseRef = new VisualisProjectResponseRef(response, code); + } catch (Exception e) { + throw new ExternalOperationFailedException(90176, "failed to parse response json", e); + } + 
visualisProjectResponseRef.setAppInstance(structureService.getAppInstance()); + visualisProjectResponseRef.setProjectRefId(projectId.longValue()); + visualisProjectResponseRef.setErrorMsg(errorMsg); + return visualisProjectResponseRef; + } + + @Override + public void setStructureService(StructureService service) { + this.structureService = service; + } + + private String getBaseUrl(){ + return structureService.getAppInstance().getBaseUrl(); + } +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectResponseRef.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectResponseRef.java new file mode 100644 index 000000000..27c761b5d --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectResponseRef.java @@ -0,0 +1,77 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.project; + +import com.google.common.collect.Maps; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectResponseRef; +import com.webank.wedatasphere.dss.standard.common.desc.AppInstance; +import com.webank.wedatasphere.dss.standard.common.entity.ref.AbstractResponseRef; +import com.webank.wedatasphere.linkis.server.BDPJettyServerHelper; +import java.util.Map; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class VisualisProjectResponseRef extends AbstractResponseRef implements ProjectResponseRef { + + private static final Logger LOGGER = LoggerFactory.getLogger(VisualisProjectResponseRef.class); + private Long projectRefId; + private AppInstance appInstance; + private String errorMsg; + + protected VisualisProjectResponseRef(String responseBody, int status) throws Exception { + super(responseBody, status); + responseMap = BDPJettyServerHelper.jacksonJson().readValue(responseBody, Map.class); + } + + @Override + public Long getProjectRefId() { + return projectRefId; + } + + @Override + public Map getProjectRefIds() { + Map projectRefIdsMap = Maps.newHashMap(); + projectRefIdsMap.put(appInstance, projectRefId); + return projectRefIdsMap; + } + + @Override + public Map toMap() { + return responseMap; + } + + @Override + public String getErrorMsg() { + return errorMsg; + } + + public void setProjectRefId(Long projectRefId) { + this.projectRefId = projectRefId; + } + + public AppInstance getAppInstance() { + return appInstance; + } + + public void setAppInstance(AppInstance appInstance) { + this.appInstance = appInstance; + } + + public void setErrorMsg(String errorMsg) { + this.errorMsg = errorMsg; + } +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectService.java 
b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectService.java new file mode 100644 index 000000000..09372ed34 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/project/VisualisProjectService.java @@ -0,0 +1,63 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.project; + +import com.webank.wedatasphere.dss.appconn.visualis.VisualisAppConn; +import com.webank.wedatasphere.dss.standard.app.sso.request.SSORequestOperation; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectDeletionOperation; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectService; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectUpdateOperation; +import com.webank.wedatasphere.dss.standard.app.structure.project.ProjectUrlOperation; +import com.webank.wedatasphere.linkis.httpclient.request.HttpAction; +import com.webank.wedatasphere.linkis.httpclient.response.HttpResult; + +public class VisualisProjectService extends ProjectService { + + @Override + public boolean isCooperationSupported() { + return true; + } + + @Override + public boolean isProjectNameUnique() { + return false; + } + + @Override + protected VisualisProjectCreationOperation createProjectCreationOperation() { + 
SSORequestOperation ssoRequestOperation = getSSORequestService().createSSORequestOperation(VisualisAppConn.VISUALIS_APPCONN_NAME); + VisualisProjectCreationOperation visualisProjectCreationOperation = new VisualisProjectCreationOperation(this, ssoRequestOperation); + visualisProjectCreationOperation.setStructureService(this); + return visualisProjectCreationOperation; + } + + @Override + protected ProjectUpdateOperation createProjectUpdateOperation() { + return null; + } + + @Override + protected ProjectDeletionOperation createProjectDeletionOperation() { + return null; + } + + @Override + protected ProjectUrlOperation createProjectUrlOperation() { + return null; + } + +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/publish/VisualisExportResponseRef.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/publish/VisualisExportResponseRef.java new file mode 100644 index 000000000..42814d7a7 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/publish/VisualisExportResponseRef.java @@ -0,0 +1,36 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.publish; + +import com.webank.wedatasphere.dss.standard.app.development.ref.DSSCommonResponseRef; + +import java.util.Map; + +public class VisualisExportResponseRef extends DSSCommonResponseRef { + + Map bmlResource; + + public VisualisExportResponseRef(String responseBody) throws Exception { + super(responseBody); + bmlResource = ((Map) responseMap.get("data")); + } + + @Override + public Map toMap() { + return bmlResource; + } +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/publish/VisualisImportResponseRef.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/publish/VisualisImportResponseRef.java new file mode 100644 index 000000000..2f4a81b2c --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/publish/VisualisImportResponseRef.java @@ -0,0 +1,58 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ *
+ */
+
+package com.webank.wedatasphere.dss.appconn.visualis.publish;
+
+import com.google.common.collect.Maps;
+import com.webank.wedatasphere.dss.standard.app.development.ref.DSSCommonResponseRef;
+import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException;
+
+import java.util.Map;
+
+public class VisualisImportResponseRef extends DSSCommonResponseRef {
+
+    Map importedMap = Maps.newHashMap();
+    Map newJobContent = Maps.newHashMap();
+
+    public VisualisImportResponseRef(Map jobContent, String responseBody, String nodeType, Object projectId) throws Exception {
+        super(responseBody);
+
+        if ("linkis.appconn.visualis.widget".equalsIgnoreCase(nodeType)) {
+            // Rewrite the old widget id to the newly imported one returned by Visualis.
+            Map payload = (Map) jobContent.get("data");
+            Long id = (long) Double.parseDouble(payload.get("widgetId").toString());
+            payload.put("widgetId", ((Double) ((Map) ((Map) responseMap.get("data")).get("widget")).get(id.toString())).toString());
+        } else if ("linkis.appconn.visualis.display".equalsIgnoreCase(nodeType)) {
+            Map payload = (Map) jobContent.get("payload");
+            Long id = (long) Double.parseDouble(payload.get("id").toString());
+            payload.put("projectId", projectId);
+            payload.put("id", ((Double) ((Map) ((Map) responseMap.get("data")).get("display")).get(id.toString())).toString());
+        } else if ("linkis.appconn.visualis.dashboard".equalsIgnoreCase(nodeType)) {
+            Map payload = (Map) jobContent.get("payload");
+            Long id = (long) Double.parseDouble(payload.get("id").toString());
+            payload.put("projectId", projectId);
+            payload.put("id", ((Double) ((Map) ((Map) responseMap.get("data")).get("dashboardPortal")).get(id.toString())).toString());
+        } else {
+            throw new ExternalOperationFailedException(90177, "Unknown task type " + nodeType, null);
+        }
+        this.newJobContent = jobContent;
+    }
+
+    @Override
+    public Map toMap() {
+        return newJobContent;
+    }
+
+}
diff --git 
a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/ref/NodeUpdateCSRequestRefImpl.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/ref/NodeUpdateCSRequestRefImpl.java new file mode 100644 index 000000000..7aa6210f1 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/ref/NodeUpdateCSRequestRefImpl.java @@ -0,0 +1,25 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.ref; + +import com.webank.wedatasphere.dss.standard.app.development.ref.UpdateCSRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.ref.impl.NodeRequestRefImpl; + + +public class NodeUpdateCSRequestRefImpl extends NodeRequestRefImpl implements UpdateCSRequestRef { + +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/ref/VisualisCommonResponseRef.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/ref/VisualisCommonResponseRef.java new file mode 100644 index 000000000..37afbffda --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/ref/VisualisCommonResponseRef.java @@ -0,0 +1,41 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.ref; + +import com.webank.wedatasphere.dss.appconn.visualis.utils.VisualisNodeUtils; +import com.webank.wedatasphere.dss.standard.app.development.ref.DSSCommonResponseRef; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; + +public class VisualisCommonResponseRef extends DSSCommonResponseRef { + + + public VisualisCommonResponseRef(String responseBody) throws Exception { + super(responseBody); + } + + public String getWidgetId() throws ExternalOperationFailedException { + return VisualisNodeUtils.getWidgetId(responseBody); + } + + public String getDisplayId() throws ExternalOperationFailedException { + return VisualisNodeUtils.getDisplayId(responseBody); + } + + public String getDashboardId() throws ExternalOperationFailedException { + return VisualisNodeUtils.getDashboardPortalId(responseBody); + } +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/ref/VisualisOpenRequestRef.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/ref/VisualisOpenRequestRef.java new file mode 100644 index 000000000..8de783392 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/ref/VisualisOpenRequestRef.java @@ -0,0 +1,39 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+ * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.ref; + +import com.webank.wedatasphere.dss.standard.app.development.ref.impl.CommonRequestRefImpl; +import com.webank.wedatasphere.dss.standard.app.development.ref.OpenRequestRef; + +import java.util.Map; + +public class VisualisOpenRequestRef extends CommonRequestRefImpl implements OpenRequestRef { + + @Override + public String getName() { + return ((Map)this.getParameters().get("params")).get("title").toString(); + } + + @Override + public String getType() { + return ((Map)this.getParameters().get("node")).get("nodeType").toString(); + } + + public Object getJobContent() { + return ((Map)this.getParameters().get("params")); + } +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/ref/VisualisOpenResponseRef.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/ref/VisualisOpenResponseRef.java new file mode 100644 index 000000000..d5e94eb3c --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/ref/VisualisOpenResponseRef.java @@ -0,0 +1,27 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.ref; + +import com.webank.wedatasphere.dss.standard.app.development.ref.CommonResponseRef; + +public class VisualisOpenResponseRef extends CommonResponseRef { + + public VisualisOpenResponseRef(String responseBody, int status) { + super(responseBody, status); + } + +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisCRUDService.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisCRUDService.java new file mode 100644 index 000000000..45a2c8c06 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisCRUDService.java @@ -0,0 +1,50 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.service; + +import com.webank.wedatasphere.dss.appconn.visualis.operation.VisualisRefCreationOperation; +import com.webank.wedatasphere.dss.appconn.visualis.operation.VisualisRefDeletionOperation; +import com.webank.wedatasphere.dss.appconn.visualis.operation.VisualisRefUpdateOperation; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefCopyOperation; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefCreationOperation; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefDeletionOperation; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefUpdateOperation; +import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefCRUDService; + +public class VisualisCRUDService extends AbstractRefCRUDService { + + @Override + protected RefCreationOperation createRefCreationOperation() { + return new VisualisRefCreationOperation(this); + } + + @Override + protected RefCopyOperation createRefCopyOperation() { + return null; + } + + @Override + protected RefUpdateOperation createRefUpdateOperation() { + return new VisualisRefUpdateOperation(this); + } + + @Override + protected RefDeletionOperation createRefDeletionOperation() { + return new VisualisRefDeletionOperation(this); + } + +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisQueryService.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisQueryService.java new file mode 100644 index 000000000..2a710cc74 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisQueryService.java @@ -0,0 +1,30 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file 
except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.service; + +import com.webank.wedatasphere.dss.appconn.visualis.operation.VisualisRefQueryOperation; +import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefQueryService; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefQueryOperation; + +public class VisualisQueryService extends AbstractRefQueryService { + + @Override + public RefQueryOperation createRefQueryOperation() { + return new VisualisRefQueryOperation(); + } + +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisRefExportService.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisRefExportService.java new file mode 100644 index 000000000..539047004 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisRefExportService.java @@ -0,0 +1,30 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.service; + +import com.webank.wedatasphere.dss.appconn.visualis.operation.VisualisRefExportOperation; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefExportOperation; +import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefExportService; + +public class VisualisRefExportService extends AbstractRefExportService { + + @Override + public RefExportOperation createRefExportOperation() { + return new VisualisRefExportOperation(this); + } + +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisRefImportService.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisRefImportService.java new file mode 100644 index 000000000..ee2ceb839 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/service/VisualisRefImportService.java @@ -0,0 +1,31 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+ * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.service; + +import com.webank.wedatasphere.dss.appconn.visualis.operation.VisualisRefImportOperation; +import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefImportService; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefImportOperation; + +public class VisualisRefImportService extends AbstractRefImportService { + + @Override + protected RefImportOperation createRefImportOperation() { + VisualisRefImportOperation visualisRefImportOperation = new VisualisRefImportOperation(this); + return visualisRefImportOperation; + } + +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/HttpUtils.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/HttpUtils.java new file mode 100644 index 000000000..11db0ae6c --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/HttpUtils.java @@ -0,0 +1,133 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ *
+ */
+
+package com.webank.wedatasphere.dss.appconn.visualis.utils;
+
+import org.apache.commons.io.IOUtils;
+import org.apache.http.HttpEntity;
+import org.apache.http.client.CookieStore;
+import org.apache.http.client.methods.CloseableHttpResponse;
+import org.apache.http.client.methods.HttpDelete;
+import org.apache.http.client.methods.HttpPost;
+import org.apache.http.client.methods.HttpPut;
+import org.apache.http.entity.StringEntity;
+import org.apache.http.impl.client.BasicCookieStore;
+import org.apache.http.impl.client.CloseableHttpClient;
+import org.apache.http.impl.client.HttpClients;
+import org.apache.http.protocol.HTTP;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.UnsupportedEncodingException;
+
+public class HttpUtils {
+
+    private static final Logger logger = LoggerFactory.getLogger(HttpUtils.class);
+
+    public static String sendPostReq(String url, String params, String user) throws Exception {
+        String resultString = "{}";
+        logger.info("sendPostReq url is: {}", url);
+        HttpPost httpPost = new HttpPost(url);
+        httpPost.addHeader(HTTP.CONTENT_ENCODING, "UTF-8");
+        httpPost.addHeader("Token-User", user);
+        httpPost.addHeader("Token-Code", "WS-AUTH");
+        CookieStore cookieStore = new BasicCookieStore();
+        logger.info("Http request params is: {}", params);
+        StringEntity entity = new StringEntity(params);
+        entity.setContentEncoding("UTF-8");
+        entity.setContentType("application/json");
+        httpPost.setEntity(entity);
+        CloseableHttpClient httpClient = null;
+        CloseableHttpResponse response = null;
+        try {
+            httpClient = HttpClients.custom().setDefaultCookieStore(cookieStore).build();
+            response = httpClient.execute(httpPost);
+            HttpEntity ent = response.getEntity();
+            resultString = IOUtils.toString(ent.getContent(), "utf-8");
+            logger.info("Send Http Request Success: {}", resultString);
+        } catch (Exception e) {
+            logger.error("Send Http Request Failed", e);
+            throw e;
+        } finally {
+            IOUtils.closeQuietly(response);
IOUtils.closeQuietly(httpClient);
+        }
+        return resultString;
+    }
+
+    public static String sendHttpDelete(String url, String user) throws Exception {
+        String resultString = "{}";
+        HttpDelete httpdelete = new HttpDelete(url);
+        logger.info("sendDeleteReq url is: {}", url);
+        httpdelete.addHeader(HTTP.CONTENT_ENCODING, "UTF-8");
+        httpdelete.addHeader("Token-User", user);
+        httpdelete.addHeader("Token-Code", "WS-AUTH");
+        CookieStore cookieStore = new BasicCookieStore();
+        CloseableHttpClient httpClient = null;
+        CloseableHttpResponse response = null;
+        try {
+            httpClient = HttpClients.custom().setDefaultCookieStore(cookieStore).build();
+            response = httpClient.execute(httpdelete);
+            HttpEntity ent = response.getEntity();
+            resultString = IOUtils.toString(ent.getContent(), "utf-8");
+            logger.info("Send Http Delete Request Success: {}", resultString);
+        } catch (Exception e) {
+            logger.error("Send Http Delete Request Failed", e);
+            throw e;
+        } finally {
+            IOUtils.closeQuietly(response);
+            IOUtils.closeQuietly(httpClient);
+        }
+        return resultString;
+    }
+
+    public static String sendHttpPut(String url, String params, String user) throws Exception {
+        String resultString = "{}";
+        logger.info("sendPutReq url is: {}", url);
+        HttpPut httpPut = new HttpPut(url);
+        httpPut.addHeader(HTTP.CONTENT_ENCODING, "UTF-8");
+        httpPut.addHeader("Token-User", user);
+        httpPut.addHeader("Token-Code", "WS-AUTH");
+        CookieStore cookieStore = new BasicCookieStore();
+        logger.info("Http put params is: {}", params);
+        StringEntity entity = new StringEntity(params);
+        entity.setContentEncoding("UTF-8");
+        entity.setContentType("application/json");
+        httpPut.setEntity(entity);
+        CloseableHttpClient httpClient = null;
+        CloseableHttpResponse response = null;
+        try {
+            httpClient = HttpClients.custom().setDefaultCookieStore(cookieStore).build();
+            response = httpClient.execute(httpPut);
+            HttpEntity ent =
response.getEntity();
+            resultString = IOUtils.toString(ent.getContent(), "utf-8");
+            logger.info("Send Http Put Success: {}", resultString);
+        } catch (Exception e) {
+            logger.error("Send Http Put Failed", e);
+            throw e;
+        } finally {
+            IOUtils.closeQuietly(response);
+            IOUtils.closeQuietly(httpClient);
+        }
+        return resultString;
+    }
+
+}
diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/NumberUtils.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/NumberUtils.java
new file mode 100644
index 000000000..2dea0fb60
--- /dev/null
+++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/NumberUtils.java
@@ -0,0 +1,34 @@
+/*
+ * Copyright 2019 WeBank
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.utils; + +public class NumberUtils { + + public static Integer getInt(Object original){ + if(original instanceof Double){ + return ((Double) original).intValue(); + } + return (Integer) original; + } + + public static String parseDoubleString(String doubleString) { + Double doubleValue = Double.parseDouble(doubleString); + Integer intValue = doubleValue.intValue(); + return intValue.toString(); + } + +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/URLUtils.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/URLUtils.java new file mode 100644 index 000000000..8ed8064f9 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/URLUtils.java @@ -0,0 +1,50 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.utils; + +import com.webank.wedatasphere.linkis.server.conf.ServerConfiguration; + +public class URLUtils { + public final static String widgetUrl = "/api/rest_j/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/widget" + "/smartcreate"; + public final static String widgetUpdateUrl = "/api/rest_j/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/widget" + "/rename"; + public final static String widgetContextUrl = "/api/rest_j/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/widget" + "/setcontext"; + public final static String widgetDeleteUrl = "/api/rest_s/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/widgets"; + public final static String displayUrl = "/api/rest_s/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/displays"; + public final static String dashboardPortalUrl = "/api/rest_s/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/dashboardPortals"; + public final static String displaySlideConfig = "{\"slideParams\":{\"width\":1920,\"height\":1080,\"backgroundColor\":[255,255,255],\"scaleMode\":\"noScale\",\"backgroundImage\":null}}"; + public final static String projectUrl = "/api/rest_j/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/project"; + + public final static String DISPLAY_PREVIEW_URL_FORMAT = "/api/rest_s/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/displays/%s/preview"; + public final static String DASHBOARD_PREVIEW_URL_FORMAT = "/api/rest_s/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/dashboard/portal/%s/preview"; + public final static String WIDGET_DATA_URL_FORMAT = "/api/rest_j/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/widget/%s/getdata"; + public final static String DISPLAY_METADATA_URL_FORMAT = "/api/rest_j/" + ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/widget/display/%s/metadata"; + public final static String DASHBOARD_METADATA_URL_FORMAT = "/api/rest_j/" + 
ServerConfiguration.BDP_SERVER_VERSION() + "/visualis/widget/portal/%s/metadata"; + + public final static String WIDGET_JUMP_URL_FORMAT = "dss/visualis/#/project/%s/widget/%s"; + public final static String DISPLAY_JUMP_URL_FORMAT = "dss/visualis/#/project/%s/display/%s"; + public final static String DASHBOARD_JUMP_URL_FORMAT = "dss/visualis/#/project/%s/portal/%s/portalName/%s"; + + + public static String getUrl(String baseUrl, String format, String entityId){ + return baseUrl + String.format(format, entityId); + } + + public static String getUrl(String baseUrl, String format, String... ids){ + return baseUrl + String.format(format, ids); + } + +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/VisualisDownloadAction.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/VisualisDownloadAction.java new file mode 100644 index 000000000..826957eb3 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/VisualisDownloadAction.java @@ -0,0 +1,58 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.utils; + +import com.webank.wedatasphere.linkis.httpclient.request.DownloadAction; +import com.webank.wedatasphere.linkis.httpclient.request.GetAction; +import com.webank.wedatasphere.linkis.httpclient.request.UserAction; + +import java.io.InputStream; + +public class VisualisDownloadAction extends GetAction implements DownloadAction, UserAction { + + private String user; + private String url; + private InputStream inputStream; + + @Override + public void write(InputStream inputStream) { + this.inputStream = inputStream; + } + + public InputStream getInputStream() { + return inputStream; + } + + @Override + public String getUser() { + return user; + } + + @Override + public void setUser(String user) { + this.user = user; + } + + public void setURL(String url) { + this.url = url; + } + + @Override + public String getURL() { + return url; + } +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/VisualisNodeUtils.java b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/VisualisNodeUtils.java new file mode 100644 index 000000000..4eca9e9dc --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/visualis/utils/VisualisNodeUtils.java @@ -0,0 +1,84 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+ * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.visualis.utils; + +import com.fasterxml.jackson.core.JsonProcessingException; +import com.fasterxml.jackson.databind.JsonMappingException; +import com.webank.wedatasphere.dss.standard.app.development.ref.NodeRequestRef; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import com.webank.wedatasphere.linkis.server.BDPJettyServerHelper; + +import java.util.Map; + +public class VisualisNodeUtils { + + public static String getId(NodeRequestRef nodeRequestRef) throws Exception { + String externalContent = BDPJettyServerHelper.jacksonJson().writeValueAsString(nodeRequestRef.getJobContent()); + if ("linkis.appconn.visualis.display".equalsIgnoreCase(nodeRequestRef.getNodeType())) { + return NumberUtils.parseDoubleString(getDisplayId(externalContent)); + } else if ("linkis.appconn.visualis.dashboard".equalsIgnoreCase(nodeRequestRef.getNodeType())) { + return NumberUtils.parseDoubleString(getDashboardPortalId(externalContent)); + } else if ("linkis.appconn.visualis.widget".equalsIgnoreCase(nodeRequestRef.getNodeType())) { + return NumberUtils.parseDoubleString(getWidgetId(externalContent)); + } + return null; + } + + + public static String getDisplayId(String responseBody) throws ExternalOperationFailedException { + String displayId = null; + try { + Map responseMap = BDPJettyServerHelper.jacksonJson().readValue(responseBody, Map.class); + displayId = ((Map) responseMap.get("payload")).get("id").toString(); + } catch (JsonMappingException e) { + throw new ExternalOperationFailedException(1000054, "Get Display Id failed!", e); + } catch (JsonProcessingException e) { + throw new ExternalOperationFailedException(1000054, "Get Display Id failed!", e); + } + + return displayId; + } + + public static String getWidgetId(String responseBody) throws 
ExternalOperationFailedException { + String widgetId = null; + try { + Map responseMap = BDPJettyServerHelper.jacksonJson().readValue(responseBody, Map.class); + widgetId = ((Map) responseMap.get("data")).get("widgetId").toString(); + } catch (JsonMappingException e) { + throw new ExternalOperationFailedException(1000055, "Get widget Id failed!", e); + } catch (JsonProcessingException e) { + throw new ExternalOperationFailedException(1000055, "Get widget Id failed!", e); + } + return widgetId; + } + + public static String getDashboardPortalId(String responseBody) throws ExternalOperationFailedException { + String dashboardPortalId = null; + try { + Map responseMap = BDPJettyServerHelper.jacksonJson().readValue(responseBody, Map.class); + dashboardPortalId = ((Map) responseMap.get("payload")).get("id").toString(); + } catch (JsonMappingException e) { + throw new ExternalOperationFailedException(1000056, "Get dashboard Id failed!", e); + } catch (JsonProcessingException e) { + throw new ExternalOperationFailedException(1000056, "Get dashboard Id failed!", e); + } + + return Long.toString(Math.round(Double.parseDouble(dashboardPortalId))); + } + + +} diff --git a/dss-appconn/appconns/dss-visualis-appconn/src/main/resources/init.sql b/dss-appconn/appconns/dss-visualis-appconn/src/main/resources/init.sql new file mode 100644 index 000000000..667e818a2 --- /dev/null +++ b/dss-appconn/appconns/dss-visualis-appconn/src/main/resources/init.sql @@ -0,0 +1,78 @@ + +delete from `dss_application` WHERE `name` ='visualis'; +INSERT INTO `dss_application`(`name`,`url`,`is_user_need_init`,`level`,`user_init_url`,`exists_project_service`,`project_url`,`enhance_json`,`if_iframe`,`homepage_url`,`redirect_url`) VALUES ('visualis','',0,1,NULL,0,'','{\"watermark\":false,\"rsDownload\":true}',1,'',NULL); + +UPDATE `dss_application` SET url = 'http://APPCONN_INSTALL_IP:APPCONN_INSTALL_PORT' WHERE `name` ='visualis'; +UPDATE `dss_application` SET project_url = 
'http://APPCONN_INSTALL_IP:APPCONN_INSTALL_PORT/dss/visualis/#/project/${projectId}',homepage_url = 'http://APPCONN_INSTALL_IP:APPCONN_INSTALL_PORT/dss/visualis/#/projects' WHERE `name` in('visualis'); + +select @dss_visualis_applicationId:=id from `dss_application` WHERE `name` ='visualis'; + +delete from `dss_menu` WHERE `name` ='visualis'; +INSERT INTO `dss_menu`(`name`,`level`,`upper_menu_id`,`front_name`,`comment`,`description`,`is_active`,`is_component`,`icon`,`application_id`) values ('visualis','2',3,'知画(Visualis)',NULL,NULL,1,1,NULL,@dss_visualis_applicationId); + +delete from `dss_onestop_menu_application` WHERE title_en='Visualis'; +INSERT INTO `dss_onestop_menu_application` (`application_id`, `onestop_menu_id`, `title_en`, `title_cn`, `desc_en`, `desc_cn`, `labels_en`, `labels_cn`, `is_active`, `access_button_en`, `access_button_cn`, `manual_button_en`, `manual_button_cn`, `manual_button_url`, `icon`, `order`, `create_by`, `create_time`, `last_update_time`, `last_update_user`, `image`) VALUES(@dss_visualis_applicationId,'2','Visualis','Visualis','Visualis is a data visualization BI tool based on Davinci, with Linkis as the kernel, it supports the analysis mode of data development exploration.','Visualis是基于宜信开源项目Davinci开发的数据可视化BI工具,以任意桥(Linkis)做为内核,支持拖拽式报表定义、图表联动、钻取、全局筛选、多维分析、实时查询等数据开发探索的分析模式,并做了水印、数据质量校验等金融级增强。','visualization, statement','可视化,报表','1','enter Visualis','进入Visualis','user manual','用户手册','http://127.0.0.1:8088/wiki/scriptis/manual/workspace_cn.html','shujukeshihua-logo',NULL,NULL,NULL,NULL,NULL,'shujukeshihua-icon'); + + +delete from `dss_appconn` where `appconn_name`='visualis'; +INSERT INTO `dss_appconn` (`appconn_name`, `is_user_need_init`, `level`, `if_iframe`, `is_external`, `reference`, `class_name`, `appconn_class_path`, `resource`) VALUES ('visualis', 0, 1, NULL, 0, NULL, 'com.webank.wedatasphere.dss.appconn.visualis.VisualisAppConn', 'DSS_INSTALL_HOME_VAL/dss-appconns/visualis/lib', ''); + +select @dss_appconn_visualisId:=id from 
`dss_appconn` where `appconn_name` = 'visualis'; + +delete from `dss_appconn_instance` where `homepage_url` like '%visualis%'; +INSERT INTO `dss_appconn_instance` (`appconn_id`, `label`, `url`, `enhance_json`, `homepage_url`, `redirect_url`) VALUES (@dss_appconn_visualisId, 'DEV', 'http://APPCONN_INSTALL_IP:APPCONN_INSTALL_PORT/', '', 'http://APPCONN_INSTALL_IP:APPCONN_INSTALL_PORT/dss/visualis/#/projects', 'http://APPCONN_INSTALL_IP:APPCONN_INSTALL_PORT/'); + + +delete from `dss_workflow_node` where `node_type` like '%visualis%'; +insert into `dss_workflow_node` (`name`, `appconn_name`, `node_type`, `jump_url`, `support_jump`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `icon`) values('display','visualis','linkis.appconn.visualis.display','http://APPCONN_INSTALL_IP:APPCONN_INSTALL_PORT/dss/visualis/#/project/${projectId}/display/${displayId}','1','1','0','1','displayCreated with Sketch.'); +insert into `dss_workflow_node` (`name`, `appconn_name`, `node_type`, `jump_url`, `support_jump`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `icon`) values('dashboard','visualis','linkis.appconn.visualis.dashboard','http://APPCONN_INSTALL_IP:APPCONN_INSTALL_PORT/dss/visualis/#/project/${projectId}/portal/${dashboardPortalId}/portalName/${nodeName}','1','1','0','1','dashboardCreated with Sketch.'); +insert into `dss_workflow_node` (`name`, `appconn_name`, `node_type`, `jump_url`, `support_jump`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `icon`) values('widget','visualis','linkis.appconn.visualis.widget','http://APPCONN_INSTALL_IP:APPCONN_INSTALL_PORT/dss/visualis/#/project/${projectId}/widget/${widgetId}','1','1','0','1','widgetCreated with Sketch.'); + + + +select @dss_visualis_displayId:=id from `dss_workflow_node` where `node_type` = 'linkis.appconn.visualis.display'; +select @dss_visualis_dashboardId:=id from `dss_workflow_node` where `node_type` = 'linkis.appconn.visualis.dashboard'; +select 
@dss_visualis_dashwidgetId:=id from `dss_workflow_node` where `node_type` = 'linkis.appconn.visualis.widget'; + +delete from `dss_workflow_node_to_group` where `node_id`=@dss_visualis_displayId; +delete from `dss_workflow_node_to_group` where `node_id`=@dss_visualis_dashboardId; +delete from `dss_workflow_node_to_group` where `node_id`=@dss_visualis_dashwidgetId; + + +delete from `dss_workflow_node_to_ui` where `workflow_node_id`=@dss_visualis_displayId; +delete from `dss_workflow_node_to_ui` where `workflow_node_id`=@dss_visualis_dashboardId; +delete from `dss_workflow_node_to_ui` where `workflow_node_id`=@dss_visualis_dashwidgetId; + + +select @dss_visualis_displayId:=id from `dss_workflow_node` where `node_type` = 'linkis.appconn.visualis.display'; +select @dss_visualis_dashboardId:=id from `dss_workflow_node` where `node_type` = 'linkis.appconn.visualis.dashboard'; +select @dss_visualis_dashwidgetId:=id from `dss_workflow_node` where `node_type` = 'linkis.appconn.visualis.widget'; + +INSERT INTO `dss_workflow_node_to_group`(`node_id`,`group_id`) values (@dss_visualis_displayId,4); +INSERT INTO `dss_workflow_node_to_group`(`node_id`,`group_id`) values (@dss_visualis_dashboardId,4); +INSERT INTO `dss_workflow_node_to_group`(`node_id`,`group_id`) values (@dss_visualis_dashwidgetId,4); + + +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_displayId,1); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_displayId,2); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_displayId,3); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_displayId,4); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_displayId,5); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_displayId,6); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) 
values (@dss_visualis_displayId,45); + +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_dashboardId,1); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_dashboardId,2); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_dashboardId,3); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_dashboardId,4); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_dashboardId,5); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_dashboardId,45); + + +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_dashwidgetId,1); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_dashwidgetId,3); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_dashwidgetId,5); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_dashwidgetId,6); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_dashwidgetId,37); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_dashwidgetId,2); +INSERT INTO `dss_workflow_node_to_ui`(`workflow_node_id`,`ui_id`) values (@dss_visualis_dashwidgetId,45); diff --git a/dss-appconn/appconns/dss-workflow-appconn/pom.xml b/dss-appconn/appconns/dss-workflow-appconn/pom.xml new file mode 100644 index 000000000..5cf95329b --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/pom.xml @@ -0,0 +1,112 @@ + + + + + + dss + com.webank.wedatasphere.dss + 1.0.0 + ../../../pom.xml + + 4.0.0 + + dss-workflow-appconn + + + com.webank.wedatasphere.dss + dss-development-process-standard + ${dss.version} + provided + + + com.webank.wedatasphere.dss + dss-appconn-core + ${dss.version} + provided + + + com.webank.wedatasphere.dss + 
dss-orchestrator-core + ${dss.version} + provided + + + + com.webank.wedatasphere.dss + dss-common + ${dss.version} + provided + + + com.webank.wedatasphere.dss + dss-sender-service + ${dss.version} + provided + + + + + + + + + org.apache.maven.plugins + maven-deploy-plugin + + + net.alchim31.maven + scala-maven-plugin + + + org.apache.maven.plugins + maven-jar-plugin + + + org.apache.maven.plugins + maven-assembly-plugin + 2.3 + false + + + make-assembly + package + + single + + + + src/main/assembly/distribution.xml + + + + + + false + out + false + false + + src/main/assembly/distribution.xml + + + + + + + \ No newline at end of file diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/assembly/distribution.xml b/dss-appconn/appconns/dss-workflow-appconn/src/main/assembly/distribution.xml new file mode 100644 index 000000000..003e51313 --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/assembly/distribution.xml @@ -0,0 +1,66 @@ + + + + dss-workflow-appconn + + dir + + true + workflow + + + + + + lib + true + true + false + true + true + + + + + + ${basedir}/src/main/resources + + appconn.properties + + 0777 + / + unix + + + + ${basedir}/src/main/resources + + log4j.properties + log4j2.xml + + 0777 + conf + unix + + + + + + diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/DefaultWorkflowAppConn.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/DefaultWorkflowAppConn.java new file mode 100644 index 000000000..8198af3b9 --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/DefaultWorkflowAppConn.java @@ -0,0 +1,37 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.workflow; + +import com.webank.wedatasphere.dss.appconn.core.ext.OnlyDevelopmentAppConn; +import com.webank.wedatasphere.dss.appconn.core.impl.AbstractAppConn; +import com.webank.wedatasphere.dss.standard.app.development.standard.DevelopmentIntegrationStandard; +import com.webank.wedatasphere.dss.standard.common.desc.AppDescImpl; + +public class DefaultWorkflowAppConn extends AbstractAppConn implements OnlyDevelopmentAppConn { + + @Override + protected void initialize() { + AppDescImpl appDesc = new AppDescImpl(); + appDesc.setAppName("WorkflowAppConn"); + setAppDesc(appDesc); + } + + @Override + public DevelopmentIntegrationStandard getOrCreateDevelopmentStandard() { + return new WorkflowDevelopmentIntegrationStandard(); + } +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/WorkflowDevelopmentIntegrationStandard.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/WorkflowDevelopmentIntegrationStandard.java new file mode 100644 index 000000000..f93d4c396 --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/WorkflowDevelopmentIntegrationStandard.java @@ -0,0 +1,53 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.workflow; + +import com.webank.wedatasphere.dss.appconn.workflow.service.WorkflowCRUDService; +import com.webank.wedatasphere.dss.appconn.workflow.service.WorkflowExportService; +import com.webank.wedatasphere.dss.appconn.workflow.service.WorkflowImportService; +import com.webank.wedatasphere.dss.appconn.workflow.service.WorkflowQueryService; +import com.webank.wedatasphere.dss.standard.app.development.service.*; +import com.webank.wedatasphere.dss.standard.app.development.standard.AbstractDevelopmentIntegrationStandard; + +public class WorkflowDevelopmentIntegrationStandard extends AbstractDevelopmentIntegrationStandard { + + @Override + protected RefCRUDService createRefCRUDService() { + return new WorkflowCRUDService(); + } + + @Override + protected RefExecutionService createRefExecutionService() { + return null; + } + + @Override + protected RefExportService createRefExportService() { + return new WorkflowExportService(); + } + + @Override + protected RefImportService createRefImportService() { + return new WorkflowImportService(); + } + + @Override + protected RefQueryService createRefQueryService() { + return new WorkflowQueryService(); + } + +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowRefExportOperation.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowRefExportOperation.java new file mode 100644 index 000000000..23ce611d7 
--- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowRefExportOperation.java @@ -0,0 +1,80 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.opertion; + +import com.webank.wedatasphere.dss.appconn.workflow.ref.WorkflowExportRequestRef; +import com.webank.wedatasphere.dss.appconn.workflow.ref.WorkflowExportResponseRef; +import com.webank.wedatasphere.dss.common.protocol.RequestExportWorkflow; +import com.webank.wedatasphere.dss.common.protocol.ResponseExportWorkflow; +import com.webank.wedatasphere.dss.common.utils.DSSExceptionUtils; +import com.webank.wedatasphere.dss.sender.service.DSSSenderServiceFactory; +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefExportOperation; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import com.webank.wedatasphere.linkis.rpc.Sender; +import com.webank.wedatasphere.linkis.server.BDPJettyServerHelper; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + + +public class WorkflowRefExportOperation implements RefExportOperation { + private static final Logger LOGGER = LoggerFactory.getLogger(WorkflowRefExportOperation.class); + + private DevelopmentService developmentService; 
+ + + @Override + public WorkflowExportResponseRef exportRef(WorkflowExportRequestRef workflowExportRequestRef) throws ExternalOperationFailedException { + + String userName = workflowExportRequestRef.getUserName(); + //todo + long flowId = workflowExportRequestRef.getAppId(); + Long projectId = workflowExportRequestRef.getProjectId(); + String projectName = workflowExportRequestRef.getProjectName(); + RequestExportWorkflow requestExportWorkflow = new RequestExportWorkflow(userName, + flowId, + projectId, + projectName, + BDPJettyServerHelper.gson().toJson(workflowExportRequestRef.getWorkspace()), + workflowExportRequestRef.getDSSLabels()); + ResponseExportWorkflow responseExportWorkflow = null; + try { + Sender sender = DSSSenderServiceFactory.getOrCreateServiceInstance().getWorkflowSender(workflowExportRequestRef.getDSSLabels()); + responseExportWorkflow = (ResponseExportWorkflow) sender.ask(requestExportWorkflow); + } catch (final Exception t) { + DSSExceptionUtils.dealErrorException(60025, "Failed to get the RPC response from the workflow server.", t, ExternalOperationFailedException.class); + } + if (null != responseExportWorkflow) { + WorkflowExportResponseRef workflowExportResponseRef = new WorkflowExportResponseRef(); + workflowExportResponseRef.setFlowID(responseExportWorkflow.flowID()); + workflowExportResponseRef.setResourceId(responseExportWorkflow.resourceId()); + workflowExportResponseRef.setVersion(responseExportWorkflow.version()); + workflowExportResponseRef.addResponse("resourceId", responseExportWorkflow.resourceId()); + workflowExportResponseRef.addResponse("version", responseExportWorkflow.version()); + workflowExportResponseRef.addResponse("flowID", responseExportWorkflow.flowID()); + return workflowExportResponseRef; + } else { + throw new ExternalOperationFailedException(100085, "Failed to ask the workflow service to export the workflow!", null); + } + } + + @Override + public void setDevelopmentService(DevelopmentService service) { + this.developmentService = service; + } + +} diff --git 
a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowRefImportOperation.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowRefImportOperation.java new file mode 100644 index 000000000..f26266e1b --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowRefImportOperation.java @@ -0,0 +1,71 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.opertion; + +import com.webank.wedatasphere.dss.appconn.workflow.ref.WorkflowImportResponseRef; +import com.webank.wedatasphere.dss.common.utils.DSSCommonUtils; +import com.webank.wedatasphere.dss.sender.service.DSSSenderServiceFactory; +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefImportOperation; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import com.webank.wedatasphere.dss.appconn.workflow.ref.WorkflowImportRequestRef; +import com.webank.wedatasphere.dss.workflow.common.protocol.RequestImportWorkflow; +import com.webank.wedatasphere.dss.workflow.common.protocol.ResponseImportWorkflow; +import com.webank.wedatasphere.linkis.rpc.Sender; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class WorkflowRefImportOperation implements + RefImportOperation { + + private static final Logger LOGGER = LoggerFactory.getLogger(WorkflowRefImportOperation.class); + + private DevelopmentService developmentService; + + @Override + public void setDevelopmentService(DevelopmentService service) { + this.developmentService = service; + } + + @Override + public WorkflowImportResponseRef importRef(WorkflowImportRequestRef requestRef) throws ExternalOperationFailedException { + + WorkflowImportResponseRef workflowImportResponseRef = null; + RequestImportWorkflow requestImportWorkflow = new RequestImportWorkflow(requestRef.getUserName(), + requestRef.getResourceId(), requestRef.getBmlVersion(), + requestRef.getProjectId(), requestRef.getProjectName(), + requestRef.getSourceEnv(), requestRef.getOrcVersion(), + requestRef.getWorkspaceName(), + DSSCommonUtils.COMMON_GSON.toJson(requestRef.getWorkspace()), + requestRef.getContextID()); + + Sender sender = 
DSSSenderServiceFactory.getOrCreateServiceInstance().getSchedulerWorkflowSender(); + if (null != sender) { + ResponseImportWorkflow responseImportWorkflow = (ResponseImportWorkflow) sender.ask(requestImportWorkflow); + workflowImportResponseRef = new WorkflowImportResponseRef(); + if (responseImportWorkflow.getWorkflowIds() != null && responseImportWorkflow.getWorkflowIds().size() > 0) { + workflowImportResponseRef.setOrcId(responseImportWorkflow.getWorkflowIds().get(0)); + } else { + LOGGER.error("Failed to get ref orc id, workflow ids are {}", responseImportWorkflow.getWorkflowIds()); + } + workflowImportResponseRef.setStatus(responseImportWorkflow.getStatus()); + } else { + throw new ExternalOperationFailedException(100039, "Rpc sender is null", null); + } + return workflowImportResponseRef; + } +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowRefQueryOperation.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowRefQueryOperation.java new file mode 100644 index 000000000..b5fb79ac6 --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowRefQueryOperation.java @@ -0,0 +1,52 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.opertion; + +import com.webank.wedatasphere.dss.appconn.workflow.ref.WorkflowUrlResponseRef; +import com.webank.wedatasphere.dss.common.exception.DSSRuntimeException; +import com.webank.wedatasphere.dss.common.label.EnvDSSLabel; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefQueryOperation; +import com.webank.wedatasphere.dss.standard.app.development.ref.OpenRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class WorkflowRefQueryOperation implements RefQueryOperation { + + protected final Logger LOGGER = LoggerFactory.getLogger(getClass()); + + private DevelopmentService developmentService; + + @Override + public WorkflowUrlResponseRef query(OpenRequestRef ref) throws ExternalOperationFailedException { + EnvDSSLabel label = (EnvDSSLabel) ref.getDSSLabels().stream().filter(EnvDSSLabel.class::isInstance) + .findFirst().orElseThrow(() -> new DSSRuntimeException(50321, "Not exists EnvDSSLabel.")); + String urlStr = "router/workflow/editable?labels=" + label.getEnv(); + if (LOGGER.isDebugEnabled()){ + LOGGER.debug("url for {} is {}", ref.getName(), urlStr); + } + WorkflowUrlResponseRef workflowUrlResponseRef = new WorkflowUrlResponseRef(); + workflowUrlResponseRef.setUrl(urlStr); + return workflowUrlResponseRef; + } + + @Override + public void setDevelopmentService(DevelopmentService service) { + this.developmentService =service; + } +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowTaskCopyOperation.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowTaskCopyOperation.java new file mode 100644 index 
000000000..7d1839ce4 --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowTaskCopyOperation.java @@ -0,0 +1,59 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.opertion; + +import com.webank.wedatasphere.dss.appconn.workflow.ref.WorkflowCopyRequestRef; +import com.webank.wedatasphere.dss.appconn.workflow.ref.WorkflowCopyResponseRef; +import com.webank.wedatasphere.dss.sender.service.DSSSenderServiceFactory; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefCopyOperation; +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; +import com.webank.wedatasphere.dss.workflow.common.protocol.RequestCopyWorkflow; +import com.webank.wedatasphere.dss.workflow.common.protocol.ResponseCopyWorkflow; +import com.webank.wedatasphere.linkis.rpc.Sender; + + +public class WorkflowTaskCopyOperation implements RefCopyOperation { + private DevelopmentService service; + + private final Sender sender = DSSSenderServiceFactory.getOrCreateServiceInstance().getWorkflowSender(); + + @Override + public void setDevelopmentService(DevelopmentService service) { + this.service = service; + } + + @Override + public WorkflowCopyResponseRef copyRef(WorkflowCopyRequestRef workflowCopyRequestRef) { + Long appId = workflowCopyRequestRef.getAppId(); + Long 
orcVersionId = workflowCopyRequestRef.getOrcVersionId(); + String userName = workflowCopyRequestRef.getUserName(); + String workspaceName = workflowCopyRequestRef.getWorkspaceName(); + String contextIdStr = workflowCopyRequestRef.getContextID(); + String projectName = workflowCopyRequestRef.getProjectName(); + String version = workflowCopyRequestRef.getVersion(); + String description = workflowCopyRequestRef.getDescription(); + RequestCopyWorkflow requestCopyWorkflow = new RequestCopyWorkflow(userName, + workspaceName, appId, contextIdStr, + projectName, orcVersionId, + version, description); + ResponseCopyWorkflow responseCopyWorkflow = (ResponseCopyWorkflow) sender.ask(requestCopyWorkflow); + WorkflowCopyResponseRef workflowCopyResponseRef = new WorkflowCopyResponseRef(); + workflowCopyResponseRef.setDssFlow(responseCopyWorkflow.getDssFlow()); + workflowCopyResponseRef.setCopyTargetAppId(responseCopyWorkflow.getDssFlow().getId()); + return workflowCopyResponseRef; + } +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowTaskCreationOperation.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowTaskCreationOperation.java new file mode 100644 index 000000000..eb649a119 --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowTaskCreationOperation.java @@ -0,0 +1,77 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.opertion; + +import com.webank.wedatasphere.dss.common.label.DSSLabel; +import com.webank.wedatasphere.dss.common.utils.DSSExceptionUtils; +import com.webank.wedatasphere.dss.orchestrator.common.ref.DefaultOrchestratorCreateRequestRef; +import com.webank.wedatasphere.dss.sender.service.DSSSenderServiceFactory; +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefCreationOperation; +import com.webank.wedatasphere.dss.standard.app.development.ref.CommonResponseRef; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import com.webank.wedatasphere.dss.workflow.common.protocol.RequestCreateWorkflow; +import com.webank.wedatasphere.dss.workflow.common.protocol.ResponseCreateWorkflow; +import com.webank.wedatasphere.linkis.rpc.Sender; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import java.util.ArrayList; +import java.util.List; + +public class WorkflowTaskCreationOperation implements RefCreationOperation { + private static final Logger LOGGER = LoggerFactory.getLogger(WorkflowTaskCreationOperation.class); + + private Sender sender = DSSSenderServiceFactory.getOrCreateServiceInstance().getWorkflowSender(); + + private DevelopmentService developmentService; + + @Override + public void setDevelopmentService(DevelopmentService service) { + this.developmentService = service; + } + + @Override + public 
CommonResponseRef createRef(DefaultOrchestratorCreateRequestRef workflowCreateRequestRef) throws ExternalOperationFailedException { + // Send an RPC request to the workflow server to create the workflow. + CommonResponseRef workflowCreateResponseRef = null; + String userName = workflowCreateRequestRef.getUserName(); + String workflowName = workflowCreateRequestRef.getDSSOrchestratorInfo().getName(); + String contextIDStr = workflowCreateRequestRef.getContextID() != null ? + workflowCreateRequestRef.getContextID() : ""; + String description = workflowCreateRequestRef.getDSSOrchestratorInfo().getDesc(); + List dssLabels = workflowCreateRequestRef.getDSSLabels(); + long parentFlowId = -1L; + List linkedAppConnNames = workflowCreateRequestRef.getDSSOrchestratorInfo().getLinkedAppConnNames() != null ? + workflowCreateRequestRef.getDSSOrchestratorInfo().getLinkedAppConnNames() : new ArrayList<>(); + String uses = workflowCreateRequestRef.getDSSOrchestratorInfo().getUses() != null ? + workflowCreateRequestRef.getDSSOrchestratorInfo().getUses() : "uses"; + RequestCreateWorkflow requestCreateWorkflow = new RequestCreateWorkflow(userName, workflowName, + contextIDStr, description, parentFlowId, uses, linkedAppConnNames, dssLabels); + + if (null != sender) { + ResponseCreateWorkflow responseCreateWorkflow = (ResponseCreateWorkflow) sender.ask(requestCreateWorkflow); + workflowCreateResponseRef = new CommonResponseRef(); + workflowCreateResponseRef.setOrcId(responseCreateWorkflow.getDssFlow().getId()); + workflowCreateResponseRef.setContent(responseCreateWorkflow.getDssFlow().getFlowJson()); + } else { + LOGGER.error("dss workflow server dev sender is null, cannot send RPC message"); + DSSExceptionUtils.dealErrorException(61123, "dss workflow server dev sender is null", ExternalOperationFailedException.class); + } + return workflowCreateResponseRef; + } +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowTaskDeletionOperation.java 
b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowTaskDeletionOperation.java new file mode 100644 index 000000000..79e958e44 --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowTaskDeletionOperation.java @@ -0,0 +1,52 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.opertion; + +import com.webank.wedatasphere.dss.appconn.workflow.ref.WorkflowDeleteRequestRef; +import com.webank.wedatasphere.dss.common.label.DSSLabel; +import com.webank.wedatasphere.dss.common.protocol.RequestDeleteWorkflow; +import com.webank.wedatasphere.dss.sender.service.DSSSenderServiceFactory; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefDeletionOperation; +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import com.webank.wedatasphere.linkis.rpc.Sender; +import java.util.List; + + +public class WorkflowTaskDeletionOperation implements RefDeletionOperation { + + private DevelopmentService developmentService; + + @Override + public void deleteRef(WorkflowDeleteRequestRef workflowDeleteRequestRef) throws ExternalOperationFailedException { + String userName = 
workflowDeleteRequestRef.getUserName(); + Long flowId = workflowDeleteRequestRef.getAppId(); + RequestDeleteWorkflow requestDeleteWorkflow = new RequestDeleteWorkflow(userName, flowId); + List dssLabels = workflowDeleteRequestRef.getDSSLabels(); + Sender tempSend = DSSSenderServiceFactory.getOrCreateServiceInstance().getWorkflowSender(dssLabels); + if (null != tempSend) { + tempSend.ask(requestDeleteWorkflow); + } else { + throw new ExternalOperationFailedException(100036, "Rpc sender is null", null); + } + } + + @Override + public void setDevelopmentService(DevelopmentService service) { + this.developmentService = service; + } +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowTaskUpdateOperation.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowTaskUpdateOperation.java new file mode 100644 index 000000000..09036969f --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/opertion/WorkflowTaskUpdateOperation.java @@ -0,0 +1,59 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.opertion; + +import com.webank.wedatasphere.dss.appconn.workflow.ref.WorkflowUpdateRequestRef; +import com.webank.wedatasphere.dss.appconn.workflow.ref.WorkflowUpdateResponseRef; +import com.webank.wedatasphere.dss.common.protocol.RequestUpdateWorkflow; +import com.webank.wedatasphere.dss.sender.service.DSSSenderServiceFactory; +import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefUpdateOperation; +import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; +import com.webank.wedatasphere.dss.workflow.common.protocol.ResponseUpdateWorkflow; +import com.webank.wedatasphere.linkis.rpc.Sender; + + +public class WorkflowTaskUpdateOperation implements RefUpdateOperation { + + private DevelopmentService developmentService; + + private final Sender sender = DSSSenderServiceFactory.getOrCreateServiceInstance().getWorkflowSender(); + + @Override + public WorkflowUpdateResponseRef updateRef(WorkflowUpdateRequestRef workflowUpdateRequestRef) throws ExternalOperationFailedException { + WorkflowUpdateResponseRef workflowUpdateResponseRef = null; + String userName = workflowUpdateRequestRef.getUserName(); + Long flowID = workflowUpdateRequestRef.getOrcId(); + String flowName = workflowUpdateRequestRef.getOrcName(); + String description = workflowUpdateRequestRef.getDescription(); + String uses = workflowUpdateRequestRef.getUses(); + RequestUpdateWorkflow requestUpdateWorkflow = new RequestUpdateWorkflow(userName, flowID, flowName, description, uses); + if (null != sender) { + ResponseUpdateWorkflow responseUpdateWorkflow = (ResponseUpdateWorkflow) sender.ask(requestUpdateWorkflow); + workflowUpdateResponseRef = new WorkflowUpdateResponseRef(); + workflowUpdateResponseRef.setJobStatus(responseUpdateWorkflow.getJobStatus()); + } else { + throw new 
ExternalOperationFailedException(100038, "Rpc sender is null", null); + } + return workflowUpdateResponseRef; + } + + @Override + public void setDevelopmentService(DevelopmentService service) { + this.developmentService = service; + } +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowCopyRequestRef.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowCopyRequestRef.java new file mode 100644 index 000000000..ade4f9e1b --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowCopyRequestRef.java @@ -0,0 +1,76 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.ref; + +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorCopyRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.ref.impl.CommonRequestRefImpl; + + +public class WorkflowCopyRequestRef extends CommonRequestRefImpl implements OrchestratorCopyRequestRef { + + private Long appId; + private Long orcVersionId; + private String version; + private String description; + + + public String getVersion() { + return version; + } + + public void setVersion(String version) { + this.version = version; + } + + public String getDescription() { + return description; + } + + public void setDescription(String description) { + this.description = description; + } + + public Long getAppId(){ + return this.appId; + } + + + @Override + public boolean equals(Object ref) { + return false; + } + + @Override + public String toString() { + return null; + } + + @Override + public void setCopyOrcAppId(long appId) { + this.appId = appId; + } + + @Override + public void setCopyOrcVersionId(long orcVersionId) { + this.orcVersionId = orcVersionId; + } + + + public Long getOrcVersionId(){ + return orcVersionId; + } +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowCopyResponseRef.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowCopyResponseRef.java new file mode 100644 index 000000000..94c4db261 --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowCopyResponseRef.java @@ -0,0 +1,68 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.ref; + +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorCopyResponseRef; +import com.webank.wedatasphere.dss.standard.app.development.ref.CommonResponseRef; +import com.webank.wedatasphere.dss.workflow.common.entity.DSSFlow; + + +public class WorkflowCopyResponseRef extends CommonResponseRef implements OrchestratorCopyResponseRef{ + + private DSSFlow dssFlow; + + private Long copyOrcId; + + private Long copyTargetAppId; + + private String copyTargetContent; + + public DSSFlow getDssFlow() { + return dssFlow; + } + + public void setDssFlow(DSSFlow dssFlow) { + this.dssFlow = dssFlow; + } + + @Override + public long getCopyOrcId() { + return copyOrcId; + } + + public void setCopyOrcId(Long copyOrcId) { + this.copyOrcId = copyOrcId; + } + + @Override + public long getCopyTargetAppId() { + return copyTargetAppId; + } + + public void setCopyTargetAppId(Long copyTargetAppId) { + this.copyTargetAppId = copyTargetAppId; + } + + @Override + public String getCopyTargetContent() { + return copyTargetContent; + } + + public void setCopyTargetContent(String copyTargetContent) { + this.copyTargetContent = copyTargetContent; + } +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowDeleteRequestRef.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowDeleteRequestRef.java new file mode 100644 index 000000000..cab8d5add --- /dev/null +++ 
b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowDeleteRequestRef.java @@ -0,0 +1,37 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.ref; + +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorDeleteRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.ref.impl.CommonRequestRefImpl; + + +public class WorkflowDeleteRequestRef extends CommonRequestRefImpl implements OrchestratorDeleteRequestRef { + + private Long appId; + + @Override + public void setAppId(Long appId) { + this.appId = appId; + } + + @Override + public Long getAppId(){ + return this.appId; + } + +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowExportRequestRef.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowExportRequestRef.java new file mode 100644 index 000000000..aa1be4a12 --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowExportRequestRef.java @@ -0,0 +1,58 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.ref; + +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorExportRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.ref.impl.CommonRequestRefImpl; + + +public class WorkflowExportRequestRef extends CommonRequestRefImpl implements OrchestratorExportRequestRef { + private Long appId; + private Long orchestratorVersionId; + + @Override + public void setAppId(Long appId) { + this.appId =appId; + } + + @Override + public Long getAppId() { + return appId; + } + + + @Override + public void setOrchestratorVersionId(Long orchestratorVersionId) { + this.orchestratorVersionId = orchestratorVersionId; + } + + @Override + public Long getOrchestratorVersionId() { + return this.orchestratorVersionId; + } + + @Override + public boolean getAddOrcVersionFlag() { + return false; + } + + @Override + public void setAddOrcVersionFlag(boolean addOrcVersion) { + + } + +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowExportResponseRef.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowExportResponseRef.java new file mode 100644 index 000000000..e540c0384 --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowExportResponseRef.java @@ -0,0 +1,54 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + 
* you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.ref; + +import com.webank.wedatasphere.dss.standard.app.development.ref.CommonResponseRef; + + +public class WorkflowExportResponseRef extends CommonResponseRef { + + private String resourceId; + + private String version; + + private Long flowID; + + public String getResourceId() { + return resourceId; + } + + public void setResourceId(String resourceId) { + this.resourceId = resourceId; + } + + public String getVersion() { + return version; + } + + public void setVersion(String version) { + this.version = version; + } + + public Long getFlowID() { + return flowID; + } + + public void setFlowID(Long flowID) { + this.flowID = flowID; + } + +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowImportRequestRef.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowImportRequestRef.java new file mode 100644 index 000000000..ac49f17bf --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowImportRequestRef.java @@ -0,0 +1,69 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.ref; + +import com.webank.wedatasphere.dss.common.entity.IOEnv; +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorImportRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.ref.impl.CommonRequestRefImpl; + + +public class WorkflowImportRequestRef extends CommonRequestRefImpl implements OrchestratorImportRequestRef { + + private String resourceId; + private String bmlVersion; + private IOEnv sourceEnv; + private String orcVersion; + + @Override + public String getResourceId() { + return resourceId; + } + + @Override + public void setResourceId(String resourceId) { + this.resourceId = resourceId; + } + + @Override + public String getBmlVersion() { + return bmlVersion; + } + + @Override + public void setBmlVersion(String bmlVersion) { + this.bmlVersion = bmlVersion; + } + @Override + public IOEnv getSourceEnv() { + return sourceEnv; + } + + @Override + public void setSourceEnv(IOEnv sourceEnv) { + this.sourceEnv = sourceEnv; + } + + @Override + public String getOrcVersion() { + return orcVersion; + } + + @Override + public void setOrcVersion(String orcVersion) { + this.orcVersion = orcVersion; + } +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowImportResponseRef.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowImportResponseRef.java new file mode 100644 index 000000000..06d140965 --- /dev/null +++ 
b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowImportResponseRef.java @@ -0,0 +1,35 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.ref; + +import com.webank.wedatasphere.dss.common.protocol.JobStatus; +import com.webank.wedatasphere.dss.standard.app.development.ref.CommonResponseRef; + +public class WorkflowImportResponseRef extends CommonResponseRef { + + private JobStatus status; + + public void setStatus(JobStatus status) { + this.status = status; + } + + @Override + public int getStatus() { + return this.status.ordinal(); + } + +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowUpdateRequestRef.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowUpdateRequestRef.java new file mode 100644 index 000000000..33ebd2fe7 --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowUpdateRequestRef.java @@ -0,0 +1,60 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.ref; + +import com.webank.wedatasphere.dss.orchestrator.common.entity.DSSOrchestratorInfo; +import com.webank.wedatasphere.dss.orchestrator.common.ref.OrchestratorUpdateRef; +import com.webank.wedatasphere.dss.standard.app.development.ref.impl.CommonRequestRefImpl; + + +public class WorkflowUpdateRequestRef extends CommonRequestRefImpl implements OrchestratorUpdateRef { + + private String description; + private String uses; + + @Override + public String getDescription() { + return description; + } + + @Override + public void setDescription(String description) { + this.description = description; + } + + @Override + public String getUses() { + return uses; + } + + @Override + public void setUses(String uses) { + this.uses = uses; + } + + @Override + public DSSOrchestratorInfo getOrchestratorInfo() { + return null; + } + + @Override + public void setOrchestratorInfo(DSSOrchestratorInfo dssOrchestratorInfo) { + + } + + +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowUpdateResponseRef.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowUpdateResponseRef.java new file mode 100644 index 000000000..0280faa9f --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowUpdateResponseRef.java @@ -0,0 +1,34 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, 
Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.ref; + +import com.webank.wedatasphere.dss.common.protocol.JobStatus; +import com.webank.wedatasphere.dss.standard.app.development.ref.CommonResponseRef; + + +public class WorkflowUpdateResponseRef extends CommonResponseRef { + private JobStatus jobStatus; + + public JobStatus getJobStatus() { + return jobStatus; + } + + public void setJobStatus(JobStatus jobStatus) { + this.jobStatus = jobStatus; + } + +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowUrlResponseRef.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowUrlResponseRef.java new file mode 100644 index 000000000..c0ab9f608 --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/ref/WorkflowUrlResponseRef.java @@ -0,0 +1,35 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+ * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.ref; + +import com.webank.wedatasphere.dss.standard.app.development.ref.UrlResponseRef; +import com.webank.wedatasphere.dss.standard.app.development.ref.CommonResponseRef; + +public class WorkflowUrlResponseRef extends CommonResponseRef implements UrlResponseRef { + + private String url; + + @Override + public String getUrl() { + return url; + } + + public void setUrl(String url) { + this.url = url; + } + +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/service/WorkflowCRUDService.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/service/WorkflowCRUDService.java new file mode 100644 index 000000000..56e94ef6c --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/service/WorkflowCRUDService.java @@ -0,0 +1,52 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.service; + +import com.webank.wedatasphere.dss.appconn.workflow.opertion.WorkflowTaskCopyOperation; +import com.webank.wedatasphere.dss.appconn.workflow.opertion.WorkflowTaskDeletionOperation; +import com.webank.wedatasphere.dss.appconn.workflow.opertion.WorkflowTaskUpdateOperation; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefCopyOperation; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefCreationOperation; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefDeletionOperation; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefUpdateOperation; +import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefCRUDService; +import com.webank.wedatasphere.dss.appconn.workflow.opertion.WorkflowTaskCreationOperation; + + +public class WorkflowCRUDService extends AbstractRefCRUDService { + + @Override + protected RefCreationOperation createRefCreationOperation() { + return new WorkflowTaskCreationOperation(); + } + + @Override + protected RefCopyOperation createRefCopyOperation() { + return new WorkflowTaskCopyOperation(); + } + + @Override + protected RefUpdateOperation createRefUpdateOperation() { + return new WorkflowTaskUpdateOperation(); + } + + @Override + protected RefDeletionOperation createRefDeletionOperation() { + return new WorkflowTaskDeletionOperation(); + } + +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/service/WorkflowExportService.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/service/WorkflowExportService.java new file mode 100644 index 000000000..29366970f --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/service/WorkflowExportService.java @@ -0,0 +1,32 @@ +/* + * Copyright 
2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.service; + +import com.webank.wedatasphere.dss.appconn.workflow.opertion.WorkflowRefExportOperation; +import com.webank.wedatasphere.dss.appconn.workflow.ref.WorkflowExportRequestRef; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefExportOperation; +import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefExportService; + +public class WorkflowExportService extends AbstractRefExportService { + + @Override + @SuppressWarnings("unchecked") + protected RefExportOperation createRefExportOperation() { + return new WorkflowRefExportOperation(); + } + +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/service/WorkflowImportService.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/service/WorkflowImportService.java new file mode 100644 index 000000000..0c6695892 --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/service/WorkflowImportService.java @@ -0,0 +1,34 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.service; + +import com.webank.wedatasphere.dss.appconn.workflow.opertion.WorkflowRefImportOperation; +import com.webank.wedatasphere.dss.appconn.workflow.ref.WorkflowImportRequestRef; +import com.webank.wedatasphere.dss.appconn.workflow.ref.WorkflowImportResponseRef; +import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefImportService; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefImportOperation; + +public class WorkflowImportService extends AbstractRefImportService { + + + @Override + @SuppressWarnings("unchecked") + protected RefImportOperation createRefImportOperation() { + return new WorkflowRefImportOperation(); + } + +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/service/WorkflowQueryService.java b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/service/WorkflowQueryService.java new file mode 100644 index 000000000..cec08b3e9 --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/java/com/webank/wedatasphere/dss/appconn/workflow/service/WorkflowQueryService.java @@ -0,0 +1,30 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.workflow.service; + +import com.webank.wedatasphere.dss.appconn.workflow.opertion.WorkflowRefQueryOperation; +import com.webank.wedatasphere.dss.standard.app.development.service.AbstractRefQueryService; +import com.webank.wedatasphere.dss.standard.app.development.operation.RefQueryOperation; + +public class WorkflowQueryService extends AbstractRefQueryService { + + @Override + protected RefQueryOperation createRefQueryOperation() { + return new WorkflowRefQueryOperation(); + } + +} diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/resources/appconn.properties b/dss-appconn/appconns/dss-workflow-appconn/src/main/resources/appconn.properties new file mode 100644 index 000000000..19365e2b5 --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/resources/appconn.properties @@ -0,0 +1,20 @@ +# +# Copyright 2019 WeBank +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# +# + + + + + diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/resources/log4j.properties b/dss-appconn/appconns/dss-workflow-appconn/src/main/resources/log4j.properties new file mode 100644 index 000000000..ee8619595 --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/resources/log4j.properties @@ -0,0 +1,36 @@ +# +# Copyright 2019 WeBank +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# +# + +### set log levels ### + +log4j.rootCategory=INFO,console + +log4j.appender.console=org.apache.log4j.ConsoleAppender +log4j.appender.console.Threshold=INFO +log4j.appender.console.layout=org.apache.log4j.PatternLayout +#log4j.appender.console.layout.ConversionPattern= %d{ISO8601} %-5p (%t) [%F:%M(%L)] - %m%n +log4j.appender.console.layout.ConversionPattern= %d{ISO8601} %-5p (%t) %p %c{1} - %m%n + + +log4j.appender.com.webank.bdp.ide.core=org.apache.log4j.DailyRollingFileAppender +log4j.appender.com.webank.bdp.ide.core.Threshold=INFO +log4j.additivity.com.webank.bdp.ide.core=false +log4j.appender.com.webank.bdp.ide.core.layout=org.apache.log4j.PatternLayout +log4j.appender.com.webank.bdp.ide.core.Append=true +log4j.appender.com.webank.bdp.ide.core.File=logs/linkis.log +log4j.appender.com.webank.bdp.ide.core.layout.ConversionPattern= %d{ISO8601} %-5p (%t) [%F:%M(%L)] - %m%n + +log4j.logger.org.springframework=INFO diff --git a/dss-appconn/appconns/dss-workflow-appconn/src/main/resources/log4j2.xml 
b/dss-appconn/appconns/dss-workflow-appconn/src/main/resources/log4j2.xml new file mode 100644 index 000000000..8c40a73e8 --- /dev/null +++ b/dss-appconn/appconns/dss-workflow-appconn/src/main/resources/log4j2.xml @@ -0,0 +1,38 @@ + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dss-appconn/dss-appconn-core/pom.xml b/dss-appconn/dss-appconn-core/pom.xml new file mode 100644 index 000000000..e54fc31a6 --- /dev/null +++ b/dss-appconn/dss-appconn-core/pom.xml @@ -0,0 +1,106 @@ + + + + + + dss + com.webank.wedatasphere.dss + 1.0.0 + + 4.0.0 + + dss-appconn-core + + + 2.11.0 + 3.2.2 + + + + + com.webank.wedatasphere.dss + dss-standard-common + ${dss.version} + + + + + com.webank.wedatasphere.dss + dss-sso-integration-standard + ${dss.version} + + + + com.webank.wedatasphere.dss + dss-structure-integration-standard + ${dss.version} + + + + com.webank.wedatasphere.dss + dss-development-process-standard + ${dss.version} + + + + commons-collections + commons-collections + ${commons-collections.version} + + + com.fasterxml.jackson.core + jackson-databind + ${fasterxml.jackson.version} + provided + + + com.fasterxml.jackson.core + jackson-annotations + ${fasterxml.jackson.version} + provided + + + com.webank.wedatasphere.linkis + linkis-common + ${linkis.version} + provided + + + + com.webank.wedatasphere.dss + dss-common + ${dss.version} + provided + + + com.webank.wedatasphere.linkis + linkis-module + ${linkis.version} + provided + + + com.webank.wedatasphere.dss + dss-origin-sso-integration-standard + ${dss.version} + + + + + + \ No newline at end of file diff --git a/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/AppConn.java b/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/AppConn.java new file mode 100644 index 000000000..add899c08 --- /dev/null +++ b/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/AppConn.java @@ -0,0 +1,40 @@ +/* + * Copyright 2019 
WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.core; + +import com.webank.wedatasphere.dss.appconn.core.exception.AppConnErrorException; +import com.webank.wedatasphere.dss.standard.common.core.AppStandard; +import com.webank.wedatasphere.dss.standard.common.desc.AppDesc; + +import java.util.List; + +public interface AppConn { + + void init() throws AppConnErrorException; + + /** + * 1. Get the AppConn bean records from the DSS table + * 2. Traverse them to get all AppInstances under each AppConn + * 3. Instantiate the real AppConn implementations + */ + List<AppStandard> getAppStandards(); + + AppDesc getAppDesc(); + + void setAppDesc(AppDesc appDesc); + +} diff --git a/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/exception/AppConnErrorException.java b/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/exception/AppConnErrorException.java new file mode 100644 index 000000000..129260c77 --- /dev/null +++ b/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/exception/AppConnErrorException.java @@ -0,0 +1,32 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.core.exception; + + +import com.webank.wedatasphere.linkis.common.exception.ErrorException; + +public class AppConnErrorException extends ErrorException { + + public AppConnErrorException(int errorCode, String errorDesc){ + super(errorCode, errorDesc); + } + public AppConnErrorException(int errorCode, String errorDesc, Throwable cause){ + super(errorCode, errorDesc); + initCause(cause); + } + +} diff --git a/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/exception/AppConnWarnException.java b/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/exception/AppConnWarnException.java new file mode 100644 index 000000000..2545d0ada --- /dev/null +++ b/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/exception/AppConnWarnException.java @@ -0,0 +1,31 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.core.exception; + +import com.webank.wedatasphere.linkis.common.exception.WarnException; + +public class AppConnWarnException extends WarnException { + + public AppConnWarnException(int errCode, String desc) { + super(errCode, desc); + } + + public AppConnWarnException(int errCode, String desc, Throwable cause) { + super(errCode, desc); + initCause(cause); + } +} diff --git a/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/ext/OnlyDevelopmentAppConn.java b/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/ext/OnlyDevelopmentAppConn.java new file mode 100644 index 000000000..fadc9f02a --- /dev/null +++ b/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/ext/OnlyDevelopmentAppConn.java @@ -0,0 +1,30 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.core.ext; + + +import com.webank.wedatasphere.dss.appconn.core.AppConn; +import com.webank.wedatasphere.dss.standard.app.development.standard.DevelopmentIntegrationStandard; + +/** + * Only the third level standard + * */ +public interface OnlyDevelopmentAppConn extends AppConn { + + DevelopmentIntegrationStandard getOrCreateDevelopmentStandard(); + +} diff --git a/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/ext/OnlySSOAppConn.java b/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/ext/OnlySSOAppConn.java new file mode 100644 index 000000000..97f17abd1 --- /dev/null +++ b/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/ext/OnlySSOAppConn.java @@ -0,0 +1,26 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.core.ext; + +import com.webank.wedatasphere.dss.appconn.core.AppConn; +import com.webank.wedatasphere.dss.standard.app.sso.SSOIntegrationStandard; + +public interface OnlySSOAppConn extends AppConn { + + SSOIntegrationStandard getOrCreateSSOStandard(); + +} diff --git a/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/ext/OnlyStructureAppConn.java b/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/ext/OnlyStructureAppConn.java new file mode 100644 index 000000000..714c835d2 --- /dev/null +++ b/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/ext/OnlyStructureAppConn.java @@ -0,0 +1,26 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.core.ext; + +import com.webank.wedatasphere.dss.appconn.core.AppConn; +import com.webank.wedatasphere.dss.standard.app.structure.StructureIntegrationStandard; + +public interface OnlyStructureAppConn extends AppConn { + + StructureIntegrationStandard getOrCreateStructureStandard(); + +} diff --git a/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/ext/SecondlyAppConn.java b/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/ext/SecondlyAppConn.java new file mode 100644 index 000000000..9405f1926 --- /dev/null +++ b/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/ext/SecondlyAppConn.java @@ -0,0 +1,22 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.core.ext; + + +public interface SecondlyAppConn extends OnlySSOAppConn, OnlyStructureAppConn { + +} \ No newline at end of file diff --git a/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/ext/ThirdlyAppConn.java b/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/ext/ThirdlyAppConn.java new file mode 100644 index 000000000..d73396949 --- /dev/null +++ b/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/ext/ThirdlyAppConn.java @@ -0,0 +1,23 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.core.ext; + +/** + * Including the first, second and third level standards + * */ +public interface ThirdlyAppConn extends OnlySSOAppConn, OnlyStructureAppConn, OnlyDevelopmentAppConn{ +} diff --git a/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/impl/AbstractAppConn.java b/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/impl/AbstractAppConn.java new file mode 100644 index 000000000..062ea5024 --- /dev/null +++ b/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/impl/AbstractAppConn.java @@ -0,0 +1,106 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.core.impl; + + +import com.webank.wedatasphere.dss.appconn.core.AppConn; +import com.webank.wedatasphere.dss.appconn.core.exception.AppConnErrorException; +import com.webank.wedatasphere.dss.appconn.core.exception.AppConnWarnException; +import com.webank.wedatasphere.dss.appconn.core.ext.OnlySSOAppConn; +import com.webank.wedatasphere.dss.standard.app.sso.request.SSORequestService; +import com.webank.wedatasphere.dss.standard.common.core.AppIntegrationStandard; +import com.webank.wedatasphere.dss.standard.common.core.AppStandard; +import com.webank.wedatasphere.dss.standard.common.desc.AppDesc; +import com.webank.wedatasphere.dss.standard.common.exception.AppStandardErrorException; +import java.util.Arrays; +import java.util.List; +import java.util.Objects; +import java.util.stream.Collectors; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + + +public abstract class AbstractAppConn implements AppConn { + + private AppDesc appDesc; + private List<AppStandard> appStandards; + protected final List<String> appStandardMethodHeader = Arrays.asList("create", "getOrCreate", "get"); + protected final Logger logger = LoggerFactory.getLogger(getClass()); + + @Override + public List<AppStandard> getAppStandards() { + if(appStandards == null) { + synchronized (appStandardMethodHeader) { + if(appStandards == null) { + try { + init(); + } catch (AppConnErrorException e) { + throw new AppConnWarnException(e.getErrCode(), e.getMessage(), e); + } + } + } + } + return appStandards; + } + + protected abstract void initialize(); + + /** + * Specification: each AppConn needs to define methods starting with create/getOrCreate/get + * and returning an AppStandard type, which are used to initialize the standards owned by the AppConn.
+ * */ + @Override + public final void init() throws AppConnErrorException { + initialize(); + appStandards = Arrays.stream(getClass().getDeclaredMethods()).map(method -> { + String methodName = method.getName(); + if(appStandardMethodHeader.stream().anyMatch(methodName::startsWith) && + AppStandard.class.isAssignableFrom(method.getReturnType())) { + try { + return (AppStandard) method.invoke(this); + } catch (ReflectiveOperationException e) { + logger.warn(methodName + " execute failed, ignore to set it into appStandardList of " + getClass().getSimpleName(), e); + } + } + return null; + }).filter(Objects::nonNull).collect(Collectors.toList()); + if(this instanceof OnlySSOAppConn) { + SSORequestService ssoRequestService = ((OnlySSOAppConn) this).getOrCreateSSOStandard().getSSORequestService(); + for(AppStandard appStandard : appStandards) { + if(appStandard instanceof AppIntegrationStandard) { + ((AppIntegrationStandard) appStandard).setSSORequestService(ssoRequestService); + } + try { + appStandard.init(); + } catch (AppStandardErrorException e) { + throw new AppConnErrorException(e.getErrCode(), "Init " + appStandard.getStandardName() + " failed!", e); + } + } + } + } + + @Override + public AppDesc getAppDesc() { + return appDesc; + } + + @Override + public void setAppDesc(AppDesc appDesc) { + this.appDesc = appDesc; + } + +} diff --git a/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/impl/AbstractOnlySSOAppConn.java b/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/impl/AbstractOnlySSOAppConn.java new file mode 100644 index 000000000..53ecf8611 --- /dev/null +++ b/dss-appconn/dss-appconn-core/src/main/java/com/webank/wedatasphere/dss/appconn/core/impl/AbstractOnlySSOAppConn.java @@ -0,0 +1,38 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
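The `init()` implementation above discovers an AppConn's standards by reflection: it scans the declared methods whose names start with `create`, `getOrCreate`, or `get` and whose return type is an `AppStandard`, invokes each one, and collects the results. The following standalone sketch shows the same pattern; all class and method names in it (`StandardScanner`, `Standard`, `DemoConn`) are illustrative stand-ins, not DSS APIs:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Objects;
import java.util.stream.Collectors;

public class StandardScanner {

    // Illustrative stand-in for AppStandard; not a DSS type.
    public interface Standard { String name(); }

    // A demo "AppConn" exposing two standard factory methods and one unrelated getter.
    public static class DemoConn {
        public Standard createSsoStandard() { return () -> "sso"; }
        public Standard getOrCreateDevStandard() { return () -> "dev"; }
        public String getLabel() { return "ignored"; } // wrong return type, filtered out
    }

    // Collect every no-arg create*/getOrCreate*/get* method whose return type is Standard,
    // invoke it, and gather the results -- the same idea as AbstractAppConn.init().
    public static List<String> discover(Object target) {
        List<String> prefixes = Arrays.asList("create", "getOrCreate", "get");
        return Arrays.stream(target.getClass().getDeclaredMethods())
                .filter(m -> prefixes.stream().anyMatch(m.getName()::startsWith))
                .filter(m -> Standard.class.isAssignableFrom(m.getReturnType()))
                .map(m -> {
                    try {
                        return ((Standard) m.invoke(target)).name();
                    } catch (ReflectiveOperationException e) {
                        return null; // a failed factory is skipped, not fatal
                    }
                })
                .filter(Objects::nonNull)
                .sorted() // getDeclaredMethods() order is unspecified
                .collect(Collectors.toList());
    }
}
```

The return-type filter is what lets ordinary getters such as `getLabel()` coexist with the convention without being mistaken for standard factories.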
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.core.impl; + +import com.webank.wedatasphere.dss.appconn.core.ext.OnlySSOAppConn; +import com.webank.wedatasphere.dss.common.utils.ClassUtils; +import com.webank.wedatasphere.dss.standard.app.sso.SSOIntegrationStandard; +import com.webank.wedatasphere.dss.standard.app.sso.SSOIntegrationStandardFactory; + +public abstract class AbstractOnlySSOAppConn extends AbstractAppConn implements OnlySSOAppConn { + + private static final SSOIntegrationStandard SSO_INTEGRATION_STANDARD; + + static { + SSOIntegrationStandardFactory ssoIntegrationStandardFactory = ClassUtils.getInstanceOrWarn(SSOIntegrationStandardFactory.class); + ssoIntegrationStandardFactory.init(); + SSO_INTEGRATION_STANDARD = ssoIntegrationStandardFactory.getSSOIntegrationStandard(); + } + + @Override + public final SSOIntegrationStandard getOrCreateSSOStandard() { + return SSO_INTEGRATION_STANDARD; + } +} diff --git a/dss-appconn/dss-appconn-loader/pom.xml b/dss-appconn/dss-appconn-loader/pom.xml new file mode 100644 index 000000000..db9b9e823 --- /dev/null +++ b/dss-appconn/dss-appconn-loader/pom.xml @@ -0,0 +1,86 @@ + + + + + + dss + com.webank.wedatasphere.dss + 1.0.0 + + 4.0.0 + + dss-appconn-loader + + + + + + com.webank.wedatasphere.dss + dss-appconn-core + ${dss.version} + + + + com.webank.wedatasphere.linkis + linkis-common + ${linkis.version} + provided + + + + com.webank.wedatasphere.dss + dss-common + ${dss.version} + provided + + + + + + + + org.apache.maven.plugins + maven-deploy-plugin + + + + net.alchim31.maven + 
scala-maven-plugin + + + org.apache.maven.plugins + maven-jar-plugin + + + **/*.yml + **/*.properties + **/*.sh + **/log4j2.xml + + + + + + + src/main/resources + + + + + \ No newline at end of file diff --git a/dss-appconn/dss-appconn-loader/src/main/java/com/webank/wedatasphere/dss/appconn/loader/clazzloader/AppConnClassLoader.java b/dss-appconn/dss-appconn-loader/src/main/java/com/webank/wedatasphere/dss/appconn/loader/clazzloader/AppConnClassLoader.java new file mode 100644 index 000000000..7f2c60882 --- /dev/null +++ b/dss-appconn/dss-appconn-loader/src/main/java/com/webank/wedatasphere/dss/appconn/loader/clazzloader/AppConnClassLoader.java @@ -0,0 +1,40 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.loader.clazzloader; + +import java.net.URL; +import java.net.URLClassLoader; + +/** + * Uses URLClassLoader to load an AppConn's jar packages into the JVM.
+ * */ +public class AppConnClassLoader extends URLClassLoader { + + public AppConnClassLoader(URL[] urls, ClassLoader parent) { + super(urls, parent); + } + + @Override + public Class<?> loadClass(String name) throws ClassNotFoundException { + return loadClass(name, false); + } + + @Override + protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException { + return super.loadClass(name, resolve); + } +} \ No newline at end of file diff --git a/dss-appconn/dss-appconn-loader/src/main/java/com/webank/wedatasphere/dss/appconn/loader/loader/AppConnLoader.java b/dss-appconn/dss-appconn-loader/src/main/java/com/webank/wedatasphere/dss/appconn/loader/loader/AppConnLoader.java new file mode 100644 index 000000000..95df46e5c --- /dev/null +++ b/dss-appconn/dss-appconn-loader/src/main/java/com/webank/wedatasphere/dss/appconn/loader/loader/AppConnLoader.java @@ -0,0 +1,29 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License.
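`AppConnClassLoader` is a thin `URLClassLoader`: each AppConn's `lib` directory is turned into a `URL[]` and loaded through its own child loader, with the application loader as parent. Since `loadClass` simply delegates to `super`, lookup stays parent-first: shared classes resolve in the parent, and only classes found in the AppConn's own jars come from the child. A minimal standalone sketch of that delegation behavior (the class name is illustrative):

```java
import java.net.URL;
import java.net.URLClassLoader;

public class ParentFirstDemo {

    // A child loader with no jars of its own: every lookup falls through to the
    // parent, which is what happens for shared classes under parent-first delegation.
    public static Class<?> loadViaChild(String className) throws ClassNotFoundException {
        URLClassLoader child = new URLClassLoader(new URL[0],
                ParentFirstDemo.class.getClassLoader());
        return child.loadClass(className);
    }
}
```

In the real loader the `URL[]` would hold the jar URLs collected from the AppConn's lib path, so each AppConn's classes stay isolated in its own loader while JDK and framework classes are still shared.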
+ * + */ + +package com.webank.wedatasphere.dss.appconn.loader.loader; + + +import com.webank.wedatasphere.dss.appconn.core.AppConn; + +/** + * Load interface specification of appconn + * */ +public interface AppConnLoader { + + AppConn getAppConn(String appConnName, String spi, String homePath) throws Exception; + +} \ No newline at end of file diff --git a/dss-appconn/dss-appconn-loader/src/main/java/com/webank/wedatasphere/dss/appconn/loader/loader/AppConnLoaderFactory.java b/dss-appconn/dss-appconn-loader/src/main/java/com/webank/wedatasphere/dss/appconn/loader/loader/AppConnLoaderFactory.java new file mode 100644 index 000000000..c84f592c3 --- /dev/null +++ b/dss-appconn/dss-appconn-loader/src/main/java/com/webank/wedatasphere/dss/appconn/loader/loader/AppConnLoaderFactory.java @@ -0,0 +1,59 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.appconn.loader.loader; + + +import com.webank.wedatasphere.dss.appconn.loader.conf.AppConnLoaderConf; +import org.apache.commons.lang.ClassUtils; +import org.apache.commons.lang.StringUtils; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class AppConnLoaderFactory { + + private static final Logger logger = LoggerFactory.getLogger(AppConnLoaderFactory.class); + + private static Class<? extends AppConnLoader> clazz = CommonAppConnLoader.class; + private static AppConnLoader appConnLoader = null; + + @SuppressWarnings("unchecked") + public static AppConnLoader getAppConnLoader(){ + if (appConnLoader == null){ + synchronized (AppConnLoaderFactory.class){ + if (appConnLoader == null){ + // The corresponding classes can be loaded by configuration + String className = AppConnLoaderConf.CLASS_LOADER_CLASS_NAME().getValue(); + if (StringUtils.isNotBlank(className)){ + try{ + clazz = ClassUtils.getClass(className); + }catch(ClassNotFoundException e){ + logger.warn(String.format("Can not get AppConnLoader class %s, CommonAppConnLoader will be used by default.", className), e); + } + } + try { + appConnLoader = clazz.newInstance(); + } catch (Exception e) { + logger.error(String.format("Can not initialize AppConnLoader class %s.", clazz.getSimpleName()), e); + } + logger.info("Use {} to load all AppConns.", clazz.getSimpleName()); + } + } + } + return appConnLoader; + } + +} diff --git a/dss-appconn/dss-appconn-loader/src/main/java/com/webank/wedatasphere/dss/appconn/loader/loader/CommonAppConnLoader.java b/dss-appconn/dss-appconn-loader/src/main/java/com/webank/wedatasphere/dss/appconn/loader/loader/CommonAppConnLoader.java new file mode 100644 index 000000000..e47a273a0 --- /dev/null +++ b/dss-appconn/dss-appconn-loader/src/main/java/com/webank/wedatasphere/dss/appconn/loader/loader/CommonAppConnLoader.java @@ -0,0 +1,82 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the
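`AppConnLoaderFactory` lazily builds a single loader instance via double-checked locking, with the implementation class overridable through configuration. A generic sketch of that pattern follows; note that for the unsynchronized first check to be safe under the Java memory model the field must be `volatile` (the factory above omits the keyword, which is a known pitfall of this idiom):

```java
public class LazySingleton {

    // volatile is what makes the unsynchronized first check safe.
    private static volatile LazySingleton instance;

    private LazySingleton() { }

    public static LazySingleton getInstance() {
        if (instance == null) {                      // fast path, no lock taken
            synchronized (LazySingleton.class) {
                if (instance == null) {              // re-check under the lock
                    instance = new LazySingleton();
                }
            }
        }
        return instance;
    }
}
```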
"License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.appconn.loader.loader; + + +import com.webank.wedatasphere.dss.appconn.core.AppConn; +import com.webank.wedatasphere.dss.appconn.loader.clazzloader.AppConnClassLoader; +import com.webank.wedatasphere.dss.appconn.loader.exception.NoSuchAppConnException; +import com.webank.wedatasphere.dss.appconn.loader.utils.AppConnUtils; +import com.webank.wedatasphere.dss.common.utils.DSSExceptionUtils; +import com.webank.wedatasphere.dss.standard.common.utils.AppStandardClassUtils; +import com.webank.wedatasphere.linkis.common.exception.ErrorException; +import java.io.File; +import java.net.URL; +import java.nio.file.Paths; +import java.util.List; +import org.apache.commons.lang.StringUtils; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class CommonAppConnLoader implements AppConnLoader { + + private static final String LIB_NAME = "lib"; + + private static final Logger LOGGER = LoggerFactory.getLogger(CommonAppConnLoader.class); + + @Override + public AppConn getAppConn(String appConnName, String spi, String homePath) throws Exception { + ClassLoader currentClassLoader = Thread.currentThread().getContextClassLoader(); + String libPathUrl; + if (StringUtils.isNotEmpty(homePath)){ + libPathUrl = new File(homePath, LIB_NAME).getPath(); + } else { + libPathUrl = Paths.get(AppConnUtils.getAppConnHomePath(), appConnName, LIB_NAME).toFile().getPath(); + } + LOGGER.info("The libPath url of AppConn {} is {}.", 
appConnName, libPathUrl); + List<URL> jars = AppConnUtils.getJarsUrlsOfPath(libPathUrl); + ClassLoader classLoader = AppStandardClassUtils.getClassLoader(appConnName, () -> new AppConnClassLoader(jars.toArray(new URL[1]), currentClassLoader)); + Thread.currentThread().setContextClassLoader(classLoader); + String fullClassName; + if (StringUtils.isEmpty(spi)) { + try { + fullClassName = AppConnUtils.getAppConnClassName(appConnName, libPathUrl, classLoader); + } catch (NoSuchAppConnException e) { + Thread.currentThread().setContextClassLoader(currentClassLoader); + throw e; + } + } else { + fullClassName = spi; + } + Class<?> clazz = null; + try { + clazz = classLoader.loadClass(fullClassName); + } catch (ClassNotFoundException e) { + Thread.currentThread().setContextClassLoader(currentClassLoader); + DSSExceptionUtils.dealErrorException(70062, fullClassName + " class not found ", e, ErrorException.class); + } + if (clazz == null) { + Thread.currentThread().setContextClassLoader(currentClassLoader); + return null; + } else { + AppConn retAppConn = (AppConn) clazz.newInstance(); + Thread.currentThread().setContextClassLoader(currentClassLoader); + LOGGER.info("AppConn is {}, retAppConn is {}.", appConnName, retAppConn.getClass().getName()); + return retAppConn; + } + } +} diff --git a/dss-appconn/dss-appconn-loader/src/main/java/com/webank/wedatasphere/dss/appconn/loader/utils/AppConnUtils.java b/dss-appconn/dss-appconn-loader/src/main/java/com/webank/wedatasphere/dss/appconn/loader/utils/AppConnUtils.java new file mode 100644 index 000000000..db12d3d1b --- /dev/null +++ b/dss-appconn/dss-appconn-loader/src/main/java/com/webank/wedatasphere/dss/appconn/loader/utils/AppConnUtils.java @@ -0,0 +1,137 @@ +/* + * Copyright 2019 WeBank + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License.
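`getAppConn` swaps the thread context class loader before resolving the AppConn class and restores the previous loader on every exit path (success, `NoSuchAppConnException`, `ClassNotFoundException`). Wrapping the swap in try/finally expresses the same invariant in one place; the helper below is a hedged sketch with a made-up name, not a DSS API:

```java
import java.util.concurrent.Callable;

public class ContextClassLoaderUtil {

    // Run an action with the given context class loader, always restoring the previous one.
    public static <T> T withContextClassLoader(ClassLoader loader, Callable<T> action) throws Exception {
        Thread current = Thread.currentThread();
        ClassLoader previous = current.getContextClassLoader();
        current.setContextClassLoader(loader);
        try {
            return action.call();
        } finally {
            current.setContextClassLoader(previous); // restored even if action throws
        }
    }
}
```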
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+package com.webank.wedatasphere.dss.appconn.loader.utils;
+
+import com.webank.wedatasphere.dss.appconn.core.AppConn;
+import com.webank.wedatasphere.dss.appconn.loader.exception.NoSuchAppConnException;
+import com.webank.wedatasphere.dss.common.utils.DSSCommonUtils;
+import com.webank.wedatasphere.linkis.common.conf.CommonVars;
+import java.io.File;
+import java.io.IOException;
+import java.lang.reflect.Modifier;
+import java.net.MalformedURLException;
+import java.net.URL;
+import java.util.ArrayList;
+import java.util.Enumeration;
+import java.util.List;
+import java.util.jar.JarEntry;
+import java.util.jar.JarFile;
+import org.apache.commons.lang.StringUtils;
+
+public class AppConnUtils {
+
+    public static final String JAR_SUF_NAME = ".jar";
+
+    public static final String APPCONN_DIR_NAME = "dss-appconns";
+
+    public static final CommonVars<String> APPCONN_HOME_PATH = CommonVars.apply("wds.dss.appconn.home.path",
+        new File(DSSCommonUtils.DSS_HOME.getValue(), APPCONN_DIR_NAME).getPath());
+
+    public static String getAppConnHomePath() {
+        return APPCONN_HOME_PATH.acquireNew();
+    }
+
+    /**
+     * Obtain the fully qualified class name of the AppConn to be instantiated.
+     * */
+    public static String getAppConnClassName(String appConnName, String libPath,
+                                             ClassLoader classLoader) throws NoSuchAppConnException, IOException {
+        // 1. Get all the jar files under the lib directory.
+        List<String> jars = getJarsOfPath(libPath);
+        // 2. Read the class names contained in each jar.
+        for (String jar : jars) {
+            for (String clazzName : getClassNameFrom(jar)) {
+                // 3. Return the first concrete subclass of AppConn found in the jar.
+                if (isChildClass(clazzName, AppConn.class, classLoader)) {
+                    return clazzName;
+                }
+            }
+        }
+        throw new NoSuchAppConnException("Cannot find an AppConn instance for AppConn " + appConnName + " in lib path " + libPath);
+    }
+
+    public static List<String> getJarsOfPath(String path) {
+        File file = new File(path);
+        List<String> jars = new ArrayList<>();
+        if (file.listFiles() != null) {
+            for (File f : file.listFiles()) {
+                // Only search jars named dss-xxxxx.jar.
+                if (!f.isDirectory() && f.getName().endsWith(JAR_SUF_NAME) && f.getName().startsWith("dss-")) {
+                    jars.add(f.getPath());
+                }
+            }
+        }
+        return jars;
+    }
+
+
+    public static List<URL> getJarsUrlsOfPath(String path) throws MalformedURLException {
+        File file = new File(path);
+        List<URL> jars = new ArrayList<>();
+        if (file.listFiles() != null) {
+            for (File f : file.listFiles()) {
+                if (!f.isDirectory() && f.getName().endsWith(JAR_SUF_NAME)) {
+                    jars.add(f.toURI().toURL());
+                }
+            }
+        }
+        return jars;
+    }
+
+
+    /**
+     * Read all the class file names from the given jar, so the caller can
+     * look for the AppConn subclass among them.
+     */
+    private static List<String> getClassNameFrom(String jarName) throws IOException {
+        List<String> fileList = new ArrayList<>();
+        JarFile jarFile = new JarFile(new File(jarName));
+        Enumeration<JarEntry> en = jarFile.entries();
+        while (en.hasMoreElements()) {
+            String name1 = en.nextElement().getName();
+            if (!name1.endsWith(".class")) {
+                continue;
+            }
+            String name2 = name1.substring(0, name1.lastIndexOf(".class"));
+            String name3 = name2.replaceAll("/", ".");
+            fileList.add(name3);
+        }
+        return fileList;
+    }
+
+
+    private static boolean isChildClass(String className, Class<?> parentClazz, ClassLoader classLoader) {
+        if (StringUtils.isEmpty(className)) {
+            return false;
+        }
+        Class<?> clazz = null;
+        try {
+            clazz = classLoader.loadClass(className);
+            // Ignore abstract classes and interfaces.
+            if (Modifier.isAbstract(clazz.getModifiers())) {
+                return false;
+            }
+            if (Modifier.isInterface(clazz.getModifiers())) {
+                return false;
+            }
+        } catch (Throwable t) {
+            return false;
+        }
+        return parentClazz.isAssignableFrom(clazz);
+    }
+
+}
diff --git a/dss-appjoint-loader/src/main/java/com/webank/wedatasphere/dss/appjoint/utils/ExceptionHelper.java b/dss-appconn/dss-appconn-loader/src/main/java/com/webank/wedatasphere/dss/appconn/loader/utils/ExceptionHelper.java
similarity index 82%
rename from dss-appjoint-loader/src/main/java/com/webank/wedatasphere/dss/appjoint/utils/ExceptionHelper.java
rename to dss-appconn/dss-appconn-loader/src/main/java/com/webank/wedatasphere/dss/appconn/loader/utils/ExceptionHelper.java
index 39a80227a..69f92a451 100644
--- a/dss-appjoint-loader/src/main/java/com/webank/wedatasphere/dss/appjoint/utils/ExceptionHelper.java
+++ b/dss-appconn/dss-appconn-loader/src/main/java/com/webank/wedatasphere/dss/appconn/loader/utils/ExceptionHelper.java
@@ -1,8 +1,7 @@
 /*
  * Copyright 2019 WeBank
- *
  * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
+ * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
@@ -15,14 +14,10 @@
  *
  */
-package com.webank.wedatasphere.dss.appjoint.utils;
+package com.webank.wedatasphere.dss.appconn.loader.utils;
 
 import com.webank.wedatasphere.linkis.common.exception.ErrorException;
 
-/**
- * created by cooperyang on 2019/11/10
- * Description:
- */
 public class ExceptionHelper {
     public static void dealErrorException(int errorCode, String errorMsg, Throwable t) throws ErrorException {
         ErrorException errorException = new ErrorException(errorCode, errorMsg);
diff --git a/dss-appconn/dss-appconn-loader/src/main/scala/com/webank/wedatasphere/dss/appconn/loader/conf/AppConnLoaderConf.scala b/dss-appconn/dss-appconn-loader/src/main/scala/com/webank/wedatasphere/dss/appconn/loader/conf/AppConnLoaderConf.scala
new file mode 100644
index 000000000..9c4942c2a
--- /dev/null
+++ b/dss-appconn/dss-appconn-loader/src/main/scala/com/webank/wedatasphere/dss/appconn/loader/conf/AppConnLoaderConf.scala
@@ -0,0 +1,23 @@
+/*
+ * Copyright 2019 WeBank
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+package com.webank.wedatasphere.dss.appconn.loader.conf
+
+import com.webank.wedatasphere.linkis.common.conf.CommonVars
+
+object AppConnLoaderConf {
+  val CLASS_LOADER_CLASS_NAME = CommonVars("dss.appconn.loader.classname", "")
+}
diff --git a/dss-appconn/dss-appconn-loader/src/main/scala/com/webank/wedatasphere/dss/appconn/loader/exception/NoSuchAppConnException.scala b/dss-appconn/dss-appconn-loader/src/main/scala/com/webank/wedatasphere/dss/appconn/loader/exception/NoSuchAppConnException.scala
new file mode 100644
index 000000000..a590d77a3
--- /dev/null
+++ b/dss-appconn/dss-appconn-loader/src/main/scala/com/webank/wedatasphere/dss/appconn/loader/exception/NoSuchAppConnException.scala
@@ -0,0 +1,22 @@
+/*
+ * Copyright 2019 WeBank
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+package com.webank.wedatasphere.dss.appconn.loader.exception
+
+import com.webank.wedatasphere.linkis.common.exception.ErrorException
+
+case class NoSuchAppConnException(errDesc:String) extends ErrorException(70059, errDesc)
+
diff --git a/dss-appconn/dss-appconn-manager/dss-appconn-manager-client/pom.xml b/dss-appconn/dss-appconn-manager/dss-appconn-manager-client/pom.xml
new file mode 100644
index 000000000..fcbddd097
--- /dev/null
+++ b/dss-appconn/dss-appconn-manager/dss-appconn-manager-client/pom.xml
@@ -0,0 +1,88 @@
+
+
+
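The `AppConnUtils` diff above discovers an AppConn implementation by scanning every class in a jar and keeping only concrete subclasses of `AppConn`. The reflective check at the heart of that scan can be exercised on its own. The sketch below is an illustrative rewrite, not DSS code: it applies the same three rules (loadable, not abstract, not an interface, assignable to the parent type) to classes already on the classpath, substituting `java.util.List` for `AppConn` so it runs without any DSS jars.

```java
import java.lang.reflect.Modifier;

public class ChildClassCheck {

    // Mirrors AppConnUtils.isChildClass: a class qualifies only if it can be
    // loaded, is neither abstract nor an interface, and is assignable to parent.
    static boolean isConcreteChild(String className, Class<?> parent, ClassLoader cl) {
        if (className == null || className.isEmpty()) {
            return false;
        }
        try {
            Class<?> clazz = cl.loadClass(className);
            if (Modifier.isAbstract(clazz.getModifiers())
                    || Modifier.isInterface(clazz.getModifiers())) {
                return false;
            }
            return parent.isAssignableFrom(clazz);
        } catch (Throwable t) {
            // Classes with missing dependencies are simply skipped,
            // just as the loader skips unloadable entries in a jar.
            return false;
        }
    }

    public static void main(String[] args) {
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        // ArrayList is a concrete List implementation -> true
        System.out.println(isConcreteChild("java.util.ArrayList", java.util.List.class, cl));
        // AbstractList is abstract -> false
        System.out.println(isConcreteChild("java.util.AbstractList", java.util.List.class, cl));
        // List itself is an interface -> false
        System.out.println(isConcreteChild("java.util.List", java.util.List.class, cl));
    }
}
```

In the real loader this check runs against the AppConn's own `AppConnClassLoader`, so the first matching class name is instantiated with `clazz.newInstance()` after the scan returns.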