Unit 2: Initializing the OU


Congratulations! Now that you have logged in to the EnOS Management Console, you can initialize your OU to make it ready for your project: set up the OU profile and security settings, manage users and permissions, get the environment information, plan resources, and purchase applications.

Managing the OU Profile and Security Settings

After logging in to the EnOS Management Console with the administrator account, you can go to the IAM > Organization Profile page to edit the basic information of your OU and change your user name as needed. For more information, see Managing the Organization Information and Organization Accounts.


To change the security settings for your organization, go to the IAM > Security Setting page and update the password policy, login IP restrictions, and session expiration time as needed. For more information, see Setting the Security Options.

Managing Users and Permissions

To enable collaboration with multiple users, you can create three types of user accounts: ordinary users (internal users within the OU), external users (imported from another OU), and LDAP users. To secure your assets and data, EnOS enforces access control for users at two levels:

  • The service level: a user needs to request the appropriate access from the OU administrator to be able to read, write, or control objects in a service.
  • The asset level: a user can only access the assets that have been authorized to them.


To create user accounts and assign access permissions to users, go to the IAM > User page. For information about how to manage users and user permissions, see Creating and Managing Users.

Getting Environment Information

To connect your devices to EnOS through MQTT, consume subscribed asset data, or invoke the API services, you will need the EnOS environment information.


The EnOS environment information varies with the cloud region and instance where EnOS is deployed. Log in to the EnOS Management Console and go to Help > Environment Information in the upper right corner to get the environment information.
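
For example, once you have the MQTT broker address from the Environment Information page, you can run a quick connectivity check from the device side. The following is a minimal sketch using the Eclipse Paho Python client (1.x API); the broker host, port, client ID, and credentials are placeholders that you would replace with the values shown for your environment and device, and the exact authentication scheme depends on how your device is registered:

    # Minimal MQTT connectivity check against the broker address listed on the
    # Help > Environment Information page. All values below are placeholders.
    import paho.mqtt.client as mqtt

    BROKER_HOST = "mqtt.example-environment.com"   # placeholder: broker address from Environment Information
    BROKER_PORT = 11883                            # placeholder: broker port from Environment Information

    def on_connect(client, userdata, flags, rc):
        # rc == 0 means the connection was accepted by the broker
        print("Connected with result code", rc)

    client = mqtt.Client(client_id="demo-device-001")          # placeholder client ID
    client.username_pw_set("demo-username", "demo-password")   # placeholder credentials
    client.on_connect = on_connect
    client.connect(BROKER_HOST, BROKER_PORT, keepalive=60)
    client.loop_forever()

The same Environment Information page also lists the addresses you will plug into your application configuration when consuming subscribed asset data or invoking API services.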

Planning Resources

Before you start your project on EnOS, you will first need to evaluate the number of your devices, the amount of data to be uploaded to EnOS, and the computing scale you need. This helps you determine how much computing and storage resource you will need to request to support your project.
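
As a rough starting point, a back-of-the-envelope calculation such as the sketch below can turn a fleet size, a measurement point count, and a reporting interval into an estimated daily data volume and storage requirement. All input figures here (device count, points per device, record size, retention period) are illustrative assumptions, not EnOS defaults:

    # Illustrative capacity estimate; every input value is an assumption that
    # should be replaced with your own project figures.
    devices = 1000                  # number of connected devices
    points_per_device = 20          # measurement points reported by each device
    report_interval_s = 60          # each point is reported once per minute
    bytes_per_record = 100          # rough size of one stored record, including overhead
    retention_days = 365            # how long the data must be kept

    records_per_day = devices * points_per_device * (86400 // report_interval_s)
    daily_volume_gb = records_per_day * bytes_per_record / 1024**3
    storage_gb = daily_volume_gb * retention_days

    print(f"Records per day:     {records_per_day:,}")
    print(f"Daily ingest volume: {daily_volume_gb:.2f} GB")
    print(f"Storage for {retention_days} days of retention: {storage_gb:.0f} GB")

With these example numbers, roughly 28.8 million records arrive per day (about 2.7 GB), which adds up to just under 1 TB over a year of retention; your own figures will of course differ.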


Based on your business requirements, you can plan for the following resources.

  • Device File Storage: IoT Hub enables users to store the measurement point data of connected devices with the device file storage resource. 0.1 TB of storage space is allocated by default when an OU is created. By estimating the number of connected devices and the device file size, you can decide how much device file storage you will need to request.
  • Device Integration: The Device Integration Service provides users with an environment to create, schedule, automate, and manage integration flows for integrating devices across enterprises and organizations. To deploy the created integration flows, you need to request the Device Integration resource.
  • Protocol Gateway: Protocol gateways in Device Onboarding enable users to use third-party system protocols to access EnOS for dynamic modeling and asset management in IoT Hub. Before you can start a protocol gateway, you need to request the Protocol Gateway resource.
  • Data Catalog: Through Data Catalog, you can create new measurement points and query metadata information about measurement points, including basic information, storage policy information, data archiving task information, and associated instance lists. Before creating new measurement points and querying their metadata, you need to request the Data Catalog resource.
  • Data Federation: Before creating data federation channels, you need to request the Data Federation resource. Different resource specifications correspond to different data querying and writing capabilities. Each resource can be associated with only one channel at a time.
  • Time Series Database: To store the time series data ingested from devices for processing or application development, you need to request the Time Series Database resource, which includes write capacity and storage space.
  • Stream Processing: The specification of the Stream Processing resource is defined by the calculation capacity of the streaming engine, that is, the number of data points that can be processed per second. By estimating the number of connected devices and measurement points, you can decide how much computing resource you will need to request (a throughput estimate is sketched after this list).
  • Stream Processing - Message Queue: For advanced stream processing pipelines, the system pipeline outputs the data records required by the pipelines to different Kafka topics. Before creating an advanced stream processing pipeline, you need to request the Stream Processing - Message Queue resource.
  • Data Archiving: Before archiving asset data from either the real-time message channel or the offline message channel to the target storage, you need to request the Data Archiving resource.
  • Batch Processing - Queue: Before using the Data Sandbox notebook to run offline data analytics tasks, you need to request the Batch Processing - Queue resource. If the tasks require higher CPU usage, choose the Computing-Intensive specification; if the tasks require higher memory usage, choose the Memory-Intensive specification.
  • Batch Processing - Container: To run big data analysis tasks (such as Python and Shell task nodes) using the batch processing service, you need to request the Batch Processing - Container resource.
  • Data Warehouse Storage: A subject-oriented, integrated data storage for Hive tables created in the data warehouse. Select the appropriate storage size according to your actual business requirements (10 GB to 1,000 GB).
  • File Storage HDFS: HDFS is used for big data analysis and storage scenarios. Data stored in File Storage HDFS can be accessed by creating Hive external tables. Select the appropriate storage size according to your actual business requirements (10 GB to 1,000 GB).
  • Data Quality: To query the quality of asset data, you need to request the Data Quality resource.
  • ML Model - Container: The MI Hub of the Enterprise Analytics Platform provides an ML model distribution center for users and data scientists, supporting a full model registration process and hosting services for model developers. Before using MI Hub to deploy ML models, you need to request the ML Model - Container resource.
  • Application Hosting: The EnOS Enterprise Container Platform (ECP) provides high-performance, scalable, end-to-end container application development and deployment services that simplify the integration, maintenance, and scaling of applications. Before deploying container resources and hosting applications, you need to request the Application Hosting resource.
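
For the Stream Processing resource mentioned in the list above, the specification is expressed as the number of data points the streaming engine can process per second, so a quick throughput estimate helps you choose a specification. The sketch below uses the same kind of illustrative assumptions as the storage estimate; the headroom factor is a suggested safety margin, not an EnOS requirement:

    # Estimate the streaming throughput (data points per second) that the
    # Stream Processing resource must sustain. All input values are illustrative.
    devices = 1000                 # connected devices sending data
    points_per_device = 20         # measurement points per device
    report_interval_s = 60         # seconds between reports for each point
    headroom = 1.5                 # safety margin for bursts and future growth

    average_rate = devices * points_per_device / report_interval_s
    required_capacity = average_rate * headroom

    print(f"Average ingest rate: {average_rate:.0f} points/second")
    print(f"Capacity to request: {required_capacity:.0f} points/second (with headroom)")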


For details about resource management, see Resource Specifications.