Configuring VMware Carbon Black Cloud Connectors

This connector allows Stellar Cyber to ingest VMware Carbon Black Cloud alerts from an AWS S3 bucket and add the alerts to the data lake.

If there is asset information in the logs for Events and Alerts, the detected assets are reported in the Assets Index.

Connector Overview: VMware Carbon Black Cloud

Capabilities

  • Collect: Yes

  • Respond: No

  • Native Alerts Mapped: Yes

  • Runs on: DP

  • Interval: Configurable

Collected Data

  • Content Type: Alert

  • Index: Syslog, Assets

  • Locating Records:

      msg_class: carbonblack_alert

      msg_origin.source: carbonblack_cloud

      msg_origin.vendor: vmware_carbonblack

      msg_origin.category: endpoint

Domain

N/A

Response Actions

N/A

Third Party Native Alert Integration Details

Alerts of type CB Analytics, Device Control, and Watchlist are mapped as native alerts.

For details, see Integration of Third Party Native Alerts.

Required Credentials

  • Carbon Black Organization Key, Bucket, Prefix, Access Key ID, and Secret Access Key


Adding a VMware Carbon Black Cloud Connector

To add a VMware Carbon Black Cloud connector:

  1. Obtain the VMware Carbon Black organization ID
  2. Gather / Configure AWS S3 information
  3. Add the connector in Stellar Cyber
  4. Test the connector
  5. Verify ingestion

Obtaining the VMware Carbon Black Organization ID

Use of this connector requires that you have previously configured your VMware Carbon Black deployment to send data to AWS S3. You also need the Organization ID to configure the connector. You can find it in your VMware Carbon Black Cloud portal under Settings | API Access | API Keys.

Gathering / Configuring the AWS S3 Information 

Use this section to obtain the following required details for use in configuring your connector:

  • Access Key ID

  • Secret Access Key

  • Bucket

  • Prefix

Creating a User

To add a user with the appropriate permissions:

  1. Log in to your AWS Management Console at https://aws.amazon.com/console. View the services in the Console Home or choose View all services.

  2. Choose IAM. The IAM Dashboard appears.

  3. Choose Policies and then choose Create Policy.

  4. In the Create policy pane, choose the JSON tab.

  5. Using the example below as a guide, edit the JSON policy document.

    The following is just an EXAMPLE. You must modify this JSON to match the resources in your own environment.

    VMware Carbon Black policy
    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "VisualEditor0",
                "Effect": "Allow",
                "Action": [
                    "s3:GetObject",
                    "s3:ListBucket"
                ],
                "Resource": [
                    "arn:aws:s3:::stellarcarbonblack/*",
                    "arn:aws:s3:::stellarcarbonblack"
                ]
            }
        ]
    }
  6. Choose Next.

  7. Give your policy a name to associate it with Stellar Cyber, then choose Create policy.

    The policy can now be attached to a user.

  8. From the IAM navigation pane, choose Users and then choose Create user.

  9. On the Specify user details page, under User details, enter a User name for the new user.

  10. (Optional) Select Provide user access to the AWS Management Console if you want to produce console login credentials for the new user, such as a password.

  11. Choose how you want to create the Console password and then choose Next.

  12. For Set permissions, choose Attach policies directly. Then search for the policy you created above.

  13. Select the checkbox to the left of the policy name, then choose Next.

  14. Verify the information and then choose Create user.
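
If you prefer to script the IAM setup rather than use the console, the following is a minimal sketch using boto3. The policy name, user name, and bucket name shown are placeholders (substitute your own values), and the script must run under AWS credentials that are allowed to manage IAM.

    import json
    import boto3

    # Placeholder names -- substitute your own bucket and user names.
    BUCKET = "stellarcarbonblack"
    USER_NAME = "stellar-carbonblack-reader"

    policy_document = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "VisualEditor0",
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:ListBucket"],
                "Resource": [
                    f"arn:aws:s3:::{BUCKET}/*",
                    f"arn:aws:s3:::{BUCKET}",
                ],
            }
        ],
    }

    iam = boto3.client("iam")

    # Create the read-only policy, create the user, and attach the policy.
    policy_arn = iam.create_policy(
        PolicyName="StellarCyberCarbonBlackS3Read",
        PolicyDocument=json.dumps(policy_document),
    )["Policy"]["Arn"]
    iam.create_user(UserName=USER_NAME)
    iam.attach_user_policy(UserName=USER_NAME, PolicyArn=policy_arn)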

Creating an Access Key

  1. Log in to the IAM Console with your AWS account ID, your IAM user name, and password. You must have the permissions required to create access keys for a user.

  2. Choose Users, then choose the user name, and click Security credentials.

  3. In the Access keys section, choose Create access key. Access keys have two parts: an access key ID and a secret access key.

  4. On the Access key best practices & alternatives page, choose Other and then choose Next.

  5. Click Create access key.

  6. On the Retrieve access keys page, choose Show to reveal the value of the user's secret access key. Save the access key ID and secret access key in a secure location. You will need them when configuring the connector in Stellar Cyber.

  7. Click Done.
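
As a quick sanity check before entering the credentials in Stellar Cyber, you can confirm that the new key pair authenticates. The following is a minimal sketch using boto3 and AWS STS; the key values shown are placeholders for the ones you just saved.

    import boto3

    # Placeholder credentials -- use the access key ID and secret access key you saved.
    sts = boto3.client(
        "sts",
        aws_access_key_id="AKIA...",
        aws_secret_access_key="...",
    )

    # A successful call returns the account ID and the ARN of the IAM user you created.
    print(sts.get_caller_identity()["Arn"])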

Creating and Configuring the S3 Bucket

  1. Navigate to the AWS S3 Console (https://s3.console.aws.amazon.com/s3/home).

  2. From the navigation pane, select Buckets to display the buckets dashboard.

  3. Click Create bucket. Give the bucket a unique name that does not contain uppercase letters or underscores; dashes are allowed.

  4. Verify the AWS region is set to the desired value (for example, US East (N. Virginia) us-east-1).

  5. Ensure the Block all Public Access check box is selected. The connector does not require public access.

  6. At the bottom of the screen, click the button to Create bucket.

  7. From the AWS S3 list of buckets, click the name of the bucket you just created.

  8. From the bucket details pane, click the button to Create folder. This is the folder to which you will send data from the VMware Carbon Black console. Note the precise folder name; you will use it as the prefix in the bucket policy in the next step, in the connector configuration, and when you add a Data Forwarder in the Carbon Black Cloud console.

    Each VMware Carbon Black Data Forwarder requires its own folder. Otherwise, data from multiple forwarders can mix in the same folder and impair parsing of data.

  9. Now, click the Permissions tab and click the button to Edit the Bucket Policy.

  10. Modify the policy as indicated below so that the Principal specifies the Carbon Black forwarder role for your product region and the Resource includes your bucket and the folder you created above. This pane includes buttons that link to sample policies and to an AWS Policy Generator tool you can use to create the JSON for your bucket policy.

    {
    	"Version": "2012-10-17",
    	"Statement": [
    		{
    			"Sid": "Statement1",
    			"Principal": {},
    			"Effect": "Allow",
    			"Action": [],
    			"Resource": []
    		}
    	]
    }

    Following is an example of a complete policy with the region, folder, and allowed actions specified, using the AWS Policy Generator tool. The most critical element to include is the Resource field.

    {
      "Version": "2012-10-17",
      "Id": "Policy1647298852280",
      "Statement": [
        {
          "Sid": "Stmt1647298832372",
          "Effect": "Allow",
          "Principal": {
            "AWS":
              "arn:aws:iam::132308400445:role/mcs-psc-prod-event-forwarder-us-east-1-event-forwarder"
            },
          "Action": [
            "s3:PutObject",
            "s3:PutObjectAcl"
          ],
          "Resource":
          "arn:aws:s3:::bucket-name/prefix-folder-name/*"
        } 
      ]
    }
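
If you manage the bucket from scripts instead of the console, the same policy can be applied with boto3's put_bucket_policy. The following is a minimal sketch, assuming placeholder values for the bucket name, prefix folder, and the forwarder role ARN for your Carbon Black product region.

    import json
    import boto3

    # Placeholder values -- substitute your bucket, prefix folder, and the
    # Carbon Black forwarder role ARN for your product region.
    BUCKET = "bucket-name"
    PREFIX = "prefix-folder-name"
    FORWARDER_ROLE = "arn:aws:iam::132308400445:role/mcs-psc-prod-event-forwarder-us-east-1-event-forwarder"

    bucket_policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowCarbonBlackForwarder",
                "Effect": "Allow",
                "Principal": {"AWS": FORWARDER_ROLE},
                "Action": ["s3:PutObject", "s3:PutObjectAcl"],
                "Resource": f"arn:aws:s3:::{BUCKET}/{PREFIX}/*",
            }
        ],
    }

    # Attach the policy to the bucket so the forwarder role can write objects.
    boto3.client("s3").put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(bucket_policy))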

At this point, you should have noted the following information from the VMware and AWS steps above:

  • Organization ID
  • Access Key ID

  • Secret Access Key

  • Bucket

  • Prefix

Adding the Connector in Stellar Cyber

With the access information handy, you can add a VMware Carbon Black Cloud connector in Stellar Cyber:

  1. Log in to Stellar Cyber.

  2. Click System | Integration | Connectors. The Connector Overview appears.

  3. Click Create. The General tab of the Add Connector screen appears. The information on this tab cannot be changed after you add the connector.

  4. Choose Endpoint Security from the Category drop-down.

  5. Choose VMware Carbon Black Cloud from the Type drop-down.

  6. For this connector, the supported Function is Collect, which is enabled already.

  7. Enter a Name.

    This field does not accept multibyte characters.

  8. Choose a Tenant Name. The Interflow records created by this connector include this tenant name.

  9. Choose the device on which to run the connector.

    • Certain connectors can be run on either a Sensor or a Data Processor. The available devices are displayed in the Run On menu. If you want to associate your collector with a sensor, you must have configured that sensor prior to configuring the connector or you will not be able to select it during initial configuration. If you select Data Processor, you will need to associate the connector with a Data Analyzer profile as a separate step. That step is not required for a sensor, which is configured with only one possible profile.

    • If the device you're connecting to is on premises, we recommend you run on the local sensor. If you're connecting to a cloud service, we recommend you run on the DP.

  10. (Optional) When the Function is Collect, you can create Log Filters. For information, see Managing Log Filters.

  11. Click Next. The following Configuration tab appears for the Collect function.

  12. For the Collect function: 

    1. Enter the Carbon Black Cloud Organization Key you noted above in Obtaining the VMware Carbon Black Organization ID.

    2. Enter the Bucket. This is the AWS S3 bucket you noted above, configured in Carbon Black Event Forwarder.

    3. Enter the Prefix you noted above, configured in Carbon Black Event Forwarder. (For this connector, it is the folder you created in the bucket above.)

      You can enter just alerts if the S3 bucket has the following prefix syntax: alerts/org_key=my_org/year=2022/month=11/day=23/hour=11/minute=56/ (the sketch after this procedure shows one way to confirm the layout).

    4. Enter the Access Key ID you copied earlier.

    5. Enter the Secret Access Key you copied earlier.

    6. Choose the Interval (min). This is how often the logs are collected.

    7. Choose the Content Type you would like to collect. The logs for Alert are supported.

  13. Click Next. The final confirmation tab appears.

  14. Click Submit.

    To pull data, a connector must be added to a Data Analyzer profile if it is running on the Data Processor.

  15. If you are adding rather than editing a connector with the Collect function enabled and you specified for it to run on a Data Processor, a dialog box now prompts you to add the connector to the default Data Analyzer profile. Click Cancel to leave it out of the default profile or click OK to add it to the default profile.

    • This prompt only occurs during the initial create connector process when Collect is enabled.

    • Certain connectors can be run on either a Sensor or a Data Processor, and some are best run on one versus the other. In any case, when the connector is run on a Data Processor, that connector must be included in a Data Analyzer profile. If you leave it out of the default profile, you must add it to another profile. You need the Administrator Root scope to add the connector to the Data Analyzer profile. If you do not have privileges to configure Data Analyzer profiles, a dialog displays recommending you ask your administrator to add it for you.

    • The first time you add a Collect connector to a profile, it pulls data immediately and then not again until the scheduled interval has elapsed. If the connector configuration dialog did not offer an option to set a specific interval, it is run every five minutes. Exceptions to this default interval are the Proofpoint on Demand (pulls data every 1 hour) and Azure Event Hub (continuously pulls data) connectors. The intervals for each connector are listed in the Connector Types & Functions topic.

    The Connector Overview appears.

The new connector is immediately active and collects logs beginning today.
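
Before testing, you can optionally confirm that the Data Forwarder is delivering alert objects under the Prefix you configured, using the same access key the connector will use. The following is a minimal sketch with boto3; the bucket, prefix, and credential values are placeholders.

    import boto3

    # Placeholder values -- use the bucket, prefix, and access key configured above.
    s3 = boto3.client(
        "s3",
        aws_access_key_id="AKIA...",
        aws_secret_access_key="...",
    )

    # Keys written by the forwarder follow the pattern
    # alerts/org_key=<org>/year=.../month=.../day=.../hour=.../minute=.../<file>
    resp = s3.list_objects_v2(Bucket="bucket-name", Prefix="alerts/", MaxKeys=10)
    for obj in resp.get("Contents", []):
        print(obj["Key"])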

Testing the Connector

When you add (or edit) a connector, we recommend that you run a test to validate the connectivity parameters you entered. (The test validates only the authentication / connectivity; it does not validate data flow).

  1. Click System | Integration | Connectors. The Connector Overview appears.

  2. Locate the connector that you added or modified, or that you want to test.

  3. Click Test at the right side of that row. The test runs immediately.

    Note that you may run only one test at a time.

Stellar Cyber conducts a basic connectivity test for the connector and reports a success or failure result. A successful test indicates that you entered all of the connector information correctly.

To aid troubleshooting your connector, the dialog remains open until you explicitly close it by using the X button. If the test fails, you can edit the connector from the same row to review and correct issues.

The connector status is updated every five (5) minutes. A successful test clears the connector status, but if issues persist, the status reverts to failed after a minute.

Repeat the test as needed.


Verifying Ingestion

To verify ingestion:

  1. Click Investigate | Threat Hunting. The Interflow Search tab appears.
  2. Change the Indices to Syslog or Alerts, or change the Indices to Assets to view asset data ingested from the logs for Events and Alerts. The table immediately updates to show ingested Interflow records.
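
To narrow the results to records from this connector, you can filter on the fields listed in the Collected Data overview above. A minimal example, assuming the query bar's standard Lucene-style syntax:

    msg_origin.source: carbonblack_cloud AND msg_class: carbonblack_alert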