Category: Technology

CUDA 8 on Amazon Linux 2017.03.1 HVM

By , August 16, 2017 8:06 am

I was able to install CUDA 8 on the EC2 instance with the following steps. Note that the EC2 instance was created with a 100 GB root EBS volume to avoid running into storage space issues.

#
# STEP 1: Install Nvidia Driver
# 384.66 is a version that has support for K80
#
cd ~
sudo yum install -y gcc kernel-devel-`uname -r`
wget http://us.download.nvidia.com/XFree86/Linux-x86_64/384.66/NVIDIA-Linux-x86_64-384.66.run
sudo /bin/bash ./NVIDIA-Linux-x86_64-384.66.run
nvidia-smi

#
# STEP 2: Install CUDA Repo
#
wget https://developer.nvidia.com/compute/cuda/8.0/Prod2/local_installers/cuda-repo-rhel6-8-0-local-ga2-8.0.61-1.x86_64-rpm
sudo rpm -i cuda-repo-rhel6-8-0-local-ga2-8.0.61-1.x86_64-rpm

#
# STEP 3: Install CUDA Toolkit
#
sudo yum install cuda-toolkit-8-0
export PATH=$PATH:/usr/local/cuda-8.0/bin
nvcc --version
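The export above only lasts for the current shell session. To make the toolkit available in future sessions (and to let the dynamic linker find the CUDA shared libraries), you may also want to append the paths to ~/.bashrc. A minimal sketch, assuming the default /usr/local/cuda-8.0 install location:

```shell
# Persist the CUDA paths for future shell sessions
# (assumes the default /usr/local/cuda-8.0 install location)
echo 'export PATH=$PATH:/usr/local/cuda-8.0/bin' >> ~/.bashrc
echo 'export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/cuda-8.0/lib64' >> ~/.bashrc
# Takes effect in new shells, or run:  . ~/.bashrc
```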

#
# STEP 4: Compile a sample program (deviceQuery) to use CUDA
#
cd /usr/local/cuda-8.0
sudo chown -R ec2-user:ec2-user samples
cd samples/1_Utilities/deviceQuery
make
./deviceQuery

At this point everything should be all set. I have also compiled and tested some other sample programs from the samples folder, and they all seemed to work.

A quick example on cuBLAS can be obtained from http://docs.nvidia.com/cuda/cublas/ . Simply copy Example 1 or Example 2 from that page and save it as test.c, then compile and run the code with the following commands. I have tested both examples and verified that they work.
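If you prefer a self-contained snippet over copying from the documentation, the following is my own minimal sketch (not the documentation's Example 1 or 2): it scales a vector on the GPU with cublasSscal, with error checking omitted for brevity. It compiles with the same nvcc command used in STEP 5.

```c
/* Minimal cuBLAS sketch: scale a vector on the GPU with cublasSscal.
 * This is an illustrative example, not the Example 1/2 from the docs. */
#include <stdio.h>
#include <cuda_runtime.h>
#include "cublas_v2.h"

int main(void)
{
    int n = 5;
    float x[5] = {1, 2, 3, 4, 5};
    float alpha = 2.0f;
    float *d_x;
    cublasHandle_t handle;

    cudaMalloc((void **)&d_x, n * sizeof(float));    /* device buffer   */
    cublasCreate(&handle);                           /* cuBLAS context  */
    cublasSetVector(n, sizeof(float), x, 1, d_x, 1); /* host -> device  */
    cublasSscal(handle, n, &alpha, d_x, 1);          /* x = alpha * x   */
    cublasGetVector(n, sizeof(float), d_x, 1, x, 1); /* device -> host  */

    for (int i = 0; i < n; i++)
        printf("%f ", x[i]);
    printf("\n");

    cublasDestroy(handle);
    cudaFree(d_x);
    return 0;
}
```

Note that this requires a GPU instance to run; the compile command is the same `nvcc test.c -lcublas -o test` shown below.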

#
# STEP 5: Compile and test cuBLAS code
#
nvcc test.c -lcublas -o test
./test

CUDA 8 on EMR with g2.2xlarge instance type

By , February 10, 2017 12:15 pm

Below is a quick recap of the steps to get CUDA 8 working on a single-node EMR cluster with the g2.2xlarge instance type. The challenges here are (1) finding a version of the Nvidia driver that works with CUDA 8, and (2) installing the Nvidia driver and the CUDA Toolkit when there is only very limited disk space on /dev/xvda1.

#
# STEP 1: Install Nvidia Driver
# 367.57 is a version that has been verified to be working with CUDA 8
#
sudo yum install -y gcc kernel-devel-`uname -r`
cd /mnt
wget http://us.download.nvidia.com/XFree86/Linux-x86_64/367.57/NVIDIA-Linux-x86_64-367.57.run
sudo /bin/bash ./NVIDIA-Linux-x86_64-367.57.run
nvidia-smi

#
# STEP 2: Install CUDA Repo
# Since we have limited disk space on /dev/xvda1, we use a symbolic link as a workaround
#
cd /mnt
wget https://developer.nvidia.com/compute/cuda/8.0/Prod2/local_installers/cuda-repo-rhel6-8-0-local-ga2-8.0.61-1.x86_64-rpm
mkdir -p /mnt/cuda-repo-8-0-local-ga2
sudo ln -s /mnt/cuda-repo-8-0-local-ga2 /var/cuda-repo-8-0-local-ga2
sudo rpm -i cuda-repo-rhel6-8-0-local-ga2-8.0.61-1.x86_64-rpm

#
# STEP 3: Install CUDA Toolkit
# Since we have limited disk space on /dev/xvda1, we use a symbolic link as a workaround
#
cd /mnt
mkdir -p /mnt/cuda-8.0
sudo ln -s /mnt/cuda-8.0 /usr/local/cuda-8.0
sudo yum install cuda-toolkit-8-0
export PATH=$PATH:/usr/local/cuda-8.0/bin
nvcc --version

#
# STEP 4: Compile a sample program (deviceQuery) to use CUDA
#
cd /usr/local/cuda-8.0
sudo chown -R hadoop:hadoop samples
cd samples/1_Utilities/deviceQuery
make
./deviceQuery

At this point everything should be all set. I have also compiled and tested some other sample programs from the samples folder, and they all seemed to work.

A quick example on cuBLAS can be obtained from http://docs.nvidia.com/cuda/cublas/ . Simply copy Example 1 or Example 2 from that page and save it as test.c, then compile and run the code with the following commands. I have tested both examples and verified that they work.

#
# STEP 5: Compile and test cuBLAS code
#
nvcc test.c -lcublas -o test
./test

Emotion Coaching for Angry Customers

By , September 14, 2016 10:46 am


This is a set of slides that I prepared on the topic of “Emotion Coaching for Angry Customers” for customer-facing roles. I am making them publicly available so that more people can benefit from this work. You are more than welcome to use them to provide training to employees in your own organization, provided that you preserve the original author information. If you need the original PowerPoint files, please drop me a note and I would be more than glad to provide them.


When you work in a customer support role, it is inevitable that you will encounter angry customers from time to time. Does the scenario shown on the screen sound familiar? We ask the customer a simple question, and the customer shouts at us over the phone. No matter what we say, they just won’t listen. It is so hard to talk to a customer who is angry, and we (almost always) try to avoid this type of customer as much as possible.

However, angry customers are like those lovely 1-star reviews. As we all know, no matter how hard you work or how good you are, the 1-star reviews will come, and they will come more than once.

If you worry about talking to angry customers, or you have been intimidated by angry customers, this training is for you.

In this training, we will talk about the theories behind anger, as well as techniques to deal with anger. First of all, we will need to understand how the brain works.


So, how does the brain work?

Most of us have been exposed to the left-brain/right-brain theory to a certain degree. It is commonly believed that our left brain is responsible for rigorous reasoning such as science and mathematics (which is what we do), and our right brain is responsible for creativity such as art and entertainment (which is what we don’t do, at least during business hours). When the people we talk to stop reasoning, we tend to say that “your left brain has nothing right, and your right brain has nothing left”.

But this does not explain why people get angry.

Another theory divides our brain into four major parts: the cerebral cortex (or the cortex), the cerebellum, the limbic system (or the limbic), and the brain stem. The cortex is the largest part of the human brain, and is associated with higher brain functions such as thought and action. The cerebellum, also called the little brain, is associated with the regulation and coordination of movement, posture, and balance. The limbic system is often referred to as the “emotional brain”, and is responsible for human emotions. The brain stem is responsible for basic vital life functions such as breathing, heartbeat, and blood pressure.

As we can see, different parts of our brain are responsible for different functions. The cortex for reasoning, the cerebellum for movements, the limbic for emotions, and the brain stem for life. It should be noted that the limbic system develops in an early stage during brain development, while the cortex develops much later. Therefore we also call the limbic system the old brain and the cortex the new brain. Under certain conditions, the old brain takes over and the new brain is shut down. At this point a person is taken over by his/her emotions, and loses his/her ability to reason. If you try to reason with him/her during this period, the conversation will be very difficult because you are talking to the wrong part of the brain.

So, do not spend time and energy talking to the wrong part of the brain.

Then the question becomes, why would a person lose his/her ability to reason?


To answer this question, we need to understand how our body responds to danger. Suppose you are hiking in the mountains, and suddenly a huge snake appears in front of you. Different people will respond to the snake differently, but all our responses can be categorized as flight (running away), fight (ha! ho!), freeze (petrified, unable to move at all), or faint (uh oh). These coping mechanisms developed over the course of evolution, and have become fundamental survival functions in all animals.

When confronted with danger, we act out of instinct instead of reasoning. The limbic system takes over to cope with the danger, and the cortex shuts down to keep you alive. If you want to study what snake it is, how big it is, and whether it is a native or a foreign species, you do that only when you are out of danger, not while you are in it.

Now assume that our customer is running a mission-critical application on our platform. Suddenly their application stops working. With each passing minute, our customer is losing users, losing customers, and facing criticism, while the competitors are catching up. Our customer is in real danger, and the coping mechanism is in action.

Now, the limbic system takes over, while the cortex shuts down. If you try to reason at this point, you are talking to the wrong part of the brain.


In such circumstances, it is very important to understand that the customer is not targeting you as a support engineer. No matter what the customer says, you need to keep calm, and don’t take it personally.

Let’s repeat it three times: don’t take it personally, don’t take it personally, don’t take it personally. If there is anything I want you to take away from this training, it is “don’t take it personally”.

When the customer has lost the ability to reason, we need to be the customer’s cortex!

But how? And how long will it take for the customer to regain the ability to reason?


To answer this question, we need to understand the difference between primary emotions and secondary emotions.

Primary emotions are those that we feel first, as a first response to a trigger. For example, we feel fear when we are threatened, we feel sadness when we hear of a death. These are the instinctive responses that we have without going through the thinking process.

Secondary emotions, on the other hand, appear after primary emotions. They usually come from a complex chain of thinking. More importantly, a secondary emotion arises when the primary emotion is overwhelming and makes us uncomfortable or vulnerable. For example, when I am threatened by somebody, I feel fear. However, the feeling of fear makes me uncomfortable, and makes me feel that I am a coward. Since I don’t want to be seen as a coward, I feel anger. Another example: I ask my manager for a raise, but my manager refuses. I feel frustrated, but I am not able to change anything. The feeling of frustration makes me uncomfortable, but I don’t want to be uncomfortable. Then I might become angry, or numb, or shut down.

When we experience primary emotions, we seek connections, and we pull others towards us. When we experience secondary emotions, we attack and criticize others, and we push others away.

More importantly, when we experience secondary emotions, the emotional part of the brain takes control, while the reasoning part of the brain shuts down. Reasoning becomes difficult because we are talking to the wrong part of the brain.


Now we understand that anger is a secondary emotion. The underlying primary emotion for anger is usually fear or sadness, which makes one feel uncomfortable or vulnerable.

Again, let’s assume that our customer is running a mission-critical application on our platform. Suddenly their application stops working. With each passing minute, our customer is losing users, losing customers, and facing criticism, while the competitors are catching up. Our customer is in real danger, and the coping mechanism is in action.

Now, our customer feels frustrated that his mission-critical application is down. He feels fear about the consequences – his boss might shout at him, he might receive a lot of complaints from his team members, and in the worst case he might lose his job. The frustration and fear he is experiencing make him feel uncomfortable and vulnerable. When these feelings become overwhelming, he senses a real danger approaching, and his brain automatically switches to “fight or flight” mode. The emotional part of the brain takes control, and the reasoning part of the brain shuts down. As a result, he becomes angry and begins to blame and criticize.

As we just said, anger is a secondary emotion. It pushes others away and makes communication difficult. In this case, it is time for us to provide some emotion coaching to our customer.


The purpose of emotion coaching is to reactivate the customer’s cortex (the reasoning part of the brain) so that he/she can start reasoning again. We do this by leading the customer from his secondary emotions back to his primary emotions. As we just discussed, when a person experiences primary emotions, he seeks connection and is open to help.

We may think that this is a very complicated process. In fact, there are only a few simple steps that we need to follow.

And, it does not take long for an adult to calm down if the appropriate steps are taken.


Listen to the customer patiently, and wait for the customer to stop talking. During this process, you can use “yeah… ah… right…” as simple acknowledgements. However, do not make any comments. You might think that the customer wants the problem solved as soon as possible. This perception is wrong. At this point, the customer’s primary need is TO BE HEARD.

After the customer slows down or finishes talking, the first step is to name the customer’s feelings, using the names of primary emotions. For example, we can say “I can see that you are very frustrated / sad / disappointed when XYZ happens”. When we name the primary emotions, we guide the customer back to his primary emotions. Do not point out that the customer is angry, and do not tell the customer to calm down. This would make the customer feel ashamed, which is uncomfortable for the customer. When the customer does not want to feel ashamed, it is very likely that he will convert this uncomfortable feeling into a secondary emotion, which is going to be anger.

The second step is to validate the customer’s feelings. When a customer experiences a threat or a loss, he has every right to feel sad, or disappointed, or frustrated. There is nothing wrong with such feelings, and we need to allow our customer to fully experience and express them. By doing so, the customer feels that he is being listened to. Such practice builds trust and brings us closer to the customer. It opens the door for future communications.

Do not worry about the time you spend on naming and validating the customer’s feelings. It does not take long for an adult to calm down if the appropriate steps are taken. Therefore, continue to stay with the customer’s feelings, repeating the previous steps when necessary. It is very unlikely that a customer would refuse your empathy.

At some point, the customer will soothe himself and calm down. The reasoning part of his brain comes back and takes control. At this point, it is time to teach our customer some general relativity, quantum mechanics, and wavelet theory to resolve whatever issue he has.


When communicating with the customer, use “I” statements as much as possible. When you use an “I” statement, you take responsibility and avoid criticizing the customer.

In the case that a customer made a mistake, avoid using “you” or “your” in your statement. For example, “user ABC did something” is better wording compared to “your user ABC did something”.

When we did something that caused the issue, we can use “your resource” to take responsibility and acknowledge the customer’s loss. For example, “the underlying hardware running your virtual machine instance failed”.


Getting Started with AWS SDK for Java (4)

By , July 29, 2016 7:46 am

The following is an example of using the AWS SimpleDB service along with AWS KMS. Since SimpleDB does not natively integrate with KMS, we have to encrypt the data before storing it in SimpleDB, and decrypt the data after retrieving it from SimpleDB.


import java.nio.*;
import java.util.*;
import java.nio.charset.*;

import com.amazonaws.regions.*;
import com.amazonaws.auth.profile.ProfileCredentialsProvider;
import com.amazonaws.services.simpledb.*;
import com.amazonaws.services.simpledb.model.*;
import com.amazonaws.services.kms.*;
import com.amazonaws.services.kms.model.*;


public class SDB
{

	public AmazonSimpleDBClient client;
	public AWSKMSClient kms;

	public String keyId = "arn:aws:kms:ap-southeast-2:[aws-account-id]:key/[aws-kms-key-very-long-id-ere]";
	public static Charset charset = Charset.forName("ASCII");
	public static CharsetEncoder encoder = charset.newEncoder();
	public static CharsetDecoder decoder = charset.newDecoder();

	public SDB()
	{
		client = new AmazonSimpleDBClient();
		client.configureRegion(Regions.AP_SOUTHEAST_2);

		kms = new AWSKMSClient();
		kms.configureRegion(Regions.AP_SOUTHEAST_2);

	}


	public void createDomain(String domain)
	{
		try
		{
			CreateDomainRequest request = new CreateDomainRequest(domain);
			client.createDomain(request);
		} catch (Exception e)
		{
			System.out.println(e.getMessage());
			e.printStackTrace();
		}
	}

	public void deleteAttribute(String domain, String item)
	{
		try
		{
			DeleteAttributesRequest request = new DeleteAttributesRequest(domain, item);
			client.deleteAttributes(request);
		} catch (Exception e)
		{
			System.out.println(e.getMessage());
			e.printStackTrace();
		}
	}

	public void putAttribute(String domain, String item, String name, String value)
	{
		try
		{
			ReplaceableAttribute attribute = new ReplaceableAttribute(name, value, true);
			List<ReplaceableAttribute> list = new ArrayList<ReplaceableAttribute>();
			list.add(attribute);

			PutAttributesRequest request = new PutAttributesRequest(domain, item, list);
			client.putAttributes(request);

		} catch (Exception e)
		{
			System.out.println(e.getMessage());
			e.printStackTrace();
		}
	}

	public String getAttribute(String domain, String item, String name)
	{
		String value = "Empty Result";
		try
		{
			GetAttributesRequest request = new GetAttributesRequest(domain, item);
			GetAttributesResult result = client.getAttributes(request);
			List<Attribute> list = result.getAttributes();
			for (Attribute attribute : list)
			{
				if (attribute.getName().equals(name))
				{
					return attribute.getValue();
				}
			}

		} catch (Exception e)
		{
			System.out.println(e.getMessage());
			e.printStackTrace();
		}
		return value;
	}

	public String encrypt(String message)
	{
		String result = "Encryption Error.";
		try
		{
			ByteBuffer plainText = encoder.encode(CharBuffer.wrap(message));
			EncryptRequest req = new EncryptRequest().withKeyId(keyId).withPlaintext(plainText);
			ByteBuffer cipherText = kms.encrypt(req).getCiphertextBlob();
			byte[] bytes = new byte[cipherText.remaining()];
			cipherText.get(bytes);
			result =  Base64.getEncoder().encodeToString(bytes);

			System.out.println("\nEncryption:");
			System.out.println("Original Text: " + message);
			System.out.println("Encrypted Text: " + result);
		} catch (Exception e)
		{
			System.out.println(e.getMessage());
			e.printStackTrace();
		}
		return result;
	}

	public String decrypt(String message)
	{
		String result = "Decryption Error.";
		try
		{
			byte[] encryptedBytes = Base64.getDecoder().decode(message);
			ByteBuffer ciphertextBlob = ByteBuffer.wrap(encryptedBytes);
			DecryptRequest req = new DecryptRequest().withCiphertextBlob(ciphertextBlob);
			ByteBuffer plainText = kms.decrypt(req).getPlaintext();
			result = decoder.decode(plainText).toString();

			System.out.println("\nDecryption:");
			System.out.println("Original Text: " + message);
			System.out.println("Decrypted Text: " + result);
		} catch (Exception e)
		{
			System.out.println(e.getMessage());
			e.printStackTrace();
		}
		return result;
	}

	public static void main(String[] args) 
	{
		String domainName = "demo-domain";    
		String itemName   = "demo-item";
		String attributeName    = "test-attribute";
		String attributeValue = "This is the information to be stored in SimpleDB.";

		SDB test = new SDB();
		String value = test.encrypt(attributeValue);
		test.putAttribute(domainName, itemName, attributeName, value);

		try
		{
			Thread.sleep(3000);	// Sleep for some time to make sure we can get the result
		} catch (Exception e) {}

		value = test.getAttribute(domainName, itemName, attributeName);
		test.decrypt(value);
	}


}
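The KMS calls aside, the string handling in encrypt() and decrypt() is just a charset encode followed by Base64. That round trip can be sketched standalone, with no AWS dependencies (the class and method names here are my own, for illustration only):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class RoundTrip
{
	// Encode a string to Base64, as encrypt() does before storing the ciphertext
	public static String encode(String message)
	{
		return Base64.getEncoder().encodeToString(message.getBytes(StandardCharsets.US_ASCII));
	}

	// Decode Base64 back to a string, as decrypt() does to recover the plaintext
	public static String decode(String encoded)
	{
		return new String(Base64.getDecoder().decode(encoded), StandardCharsets.US_ASCII);
	}

	public static void main(String[] args)
	{
		String original = "This is the information to be stored in SimpleDB.";
		String encoded = encode(original);
		System.out.println("Encoded: " + encoded);
		System.out.println("Decoded: " + decode(encoded));
	}
}
```

Base64 matters here because the KMS ciphertext is binary, while SimpleDB attribute values are strings.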

Getting Started with AWS SDK for Java (3)

By , February 13, 2016 10:40 am

This is the 3rd part of my tutorial on “Getting Started with AWS SDK for Java”. If you have not already done so, I suggest that you first take a look at the first chapter of this series, “Getting Started with AWS SDK for Java (1)”, to properly set up your development environment. In this part, we will cover the basic concepts related to the DataPipelineClient. Through this example, you will be able to create and activate a simple pipeline with a ShellCommandActivity running on an Ec2Resource.

Before you get started with this demo, you should get yourself familiar with what Data Pipeline is. In particular, the following AWS documentation on “Data Pipeline Concepts” is very helpful.

http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-concepts.html

First of all, we create an instance of the DataPipelineClient in the constructor, and then set the region to ap-southeast-2. For debugging purposes, we enable logging using log4j.

public class DemoDataPipeline
{
	static DataPipelineClient client;
	final static Logger logger = Logger.getLogger(DemoDataPipeline.class);

	/**
	 *
	 * Constructor
	 *
	 */

	public DemoDataPipeline()
	{
		// Create the DataPipelineClient
		client = new DataPipelineClient();
		// Set the region to ap-southeast-2
		client.configureRegion(Regions.AP_SOUTHEAST_2);
	}

We use the createPipeline() method in DataPipelineClient to create a new pipeline. This method takes a CreatePipelineRequest as the parameter, which requires a name and a unique id for the pipeline to be created. Here we use the java.util.UUID utility to generate a unique id for the pipeline. This creates an empty pipeline for us.

	public void createPipeline() throws Exception
	{
		System.out.println("CREATE PIPELINE.");
		
		CreatePipelineRequest request = new CreatePipelineRequest();
		request.setName("Java SDK Demo");
		String uuid = UUID.randomUUID().toString();
		request.setUniqueId(uuid);
		client.createPipeline(request);
	}

We can use the listPipelines() method in DataPipelineClient to get a list of the pipelines. This returns a ListPipelinesResult, which includes a list of PipelineIdName objects. We traverse this list to obtain the id and name of each pipeline.

	public void listPipeline() throws Exception
	{
		System.out.println("LIST PIPELINE.");
		
		ListPipelinesResult result = client.listPipelines();
		List<PipelineIdName> list = result.getPipelineIdList();
		for (PipelineIdName pipeline : list)
		{
			System.out.println(pipeline.getId() + "\t- " + pipeline.getName());
		}
	}

Now we have the id of the newly created pipeline. In the AWS SDK for Java, the pipeline components specifying the data sources, activities, schedule, and preconditions of the workflow are represented as PipelineObject instances. The following code defines a Default object, a Schedule object, an Ec2Resource object, and a ShellCommandActivity object. A PipelineObject is a collection of key-value fields. For example, the following JSON string defines an Ec2Resource in a VPC:

{
  "id" : "MyEC2Resource",
  "type" : "Ec2Resource",
  "actionOnTaskFailure" : "terminate",
  "actionOnResourceFailure" : "retryAll",
  "maximumRetries" : "1",
  "instanceType" : "m1.medium",
  "securityGroupIds" : [
    "sg-12345678",
    "sg-12345678"
  ],
  "subnetId" : "subnet-12345678",
  "associatePublicIpAddress" : "true",
  "keyPair" : "my-key-pair"
}

When the value of a key is another pipeline object, we use Field().withKey("field_name").withRefValue("object_id") to represent the key-value pair. Otherwise, we use Field().withKey("field_name").withStringValue("field_value"). Please refer to the ShellCommandActivity portion of the following code for details.

	public void definePipeline(String id) throws Exception
	{
		System.out.println("Define PIPELINE.");

		// Definition of the default object
		Field defaultScheduleType = new Field().withKey("scheduleType").withStringValue("CRON");
		Field defaultSchedule = new Field().withKey("schedule").withRefValue("RunOnceSchedule");
		Field defaultFailureAndRerunMode = new Field().withKey("failureAndRerunMode").withStringValue("CASCADE");
		Field defaultRole = new Field().withKey("role").withStringValue("DataPipelineDefaultRole");
		Field defaultResourceRole = new Field().withKey("resourceRole").withStringValue("DataPipelineDefaultResourceRole");
		Field defaultLogUri = new Field().withKey("pipelineLogUri").withStringValue("s3://331982-syd/java-dp-log");
		List<Field> defaultFieldList = Lists.newArrayList(defaultScheduleType, defaultSchedule, defaultFailureAndRerunMode, defaultRole, defaultResourceRole, defaultLogUri);
		PipelineObject defaultObject = new PipelineObject().withName("Default").withId("Default").withFields(defaultFieldList);

		// Definition of the pipeline schedule
		Field scheduleType = new Field().withKey("type").withStringValue("Schedule");
		Field scheduleStartAt = new Field().withKey("startAt").withStringValue("FIRST_ACTIVATION_DATE_TIME");
		Field schedulePeriod = new Field().withKey("period").withStringValue("1 day");
		Field scheduleOccurrences = new Field().withKey("occurrences").withStringValue("1");
		List<Field> scheduleFieldList = Lists.newArrayList(scheduleType, scheduleStartAt, schedulePeriod, scheduleOccurrences);
		PipelineObject schedule = new PipelineObject().withName("RunOnceSchedule").withId("RunOnceSchedule").withFields(scheduleFieldList);

		// Definition of the Ec2Resource
		Field ec2Type = new Field().withKey("type").withStringValue("Ec2Resource");
		Field ec2TerminateAfter = new Field().withKey("terminateAfter").withStringValue("15 minutes");
		List<Field> ec2FieldList = Lists.newArrayList(ec2Type, ec2TerminateAfter);
		PipelineObject ec2 = new PipelineObject().withName("Ec2Instance").withId("Ec2Instance").withFields(ec2FieldList);

		// Definition of the ShellCommandActivity
		// The ShellCommandActivity is a command "df -h"
		Field activityType = new Field().withKey("type").withStringValue("ShellCommandActivity");
		Field activityRunsOn = new Field().withKey("runsOn").withRefValue("Ec2Instance");
		Field activityCommand = new Field().withKey("command").withStringValue("df -h");
		Field activityStdout = new Field().withKey("stdout").withStringValue("s3://331982-syd/dp-java-demo-stdout");
		Field activityStderr = new Field().withKey("stderr").withStringValue("s3://331982-syd/dp-java-demo-stderr");
		Field activitySchedule = new Field().withKey("schedule").withRefValue("RunOnceSchedule");
		List<Field> activityFieldList = Lists.newArrayList(activityType, activityRunsOn, activityCommand, activityStdout, activityStderr, activitySchedule);
		PipelineObject activity = new PipelineObject().withName("DfCommand").withId("DfCommand").withFields(activityFieldList);

		// setPipelineObjects
		List<PipelineObject> objects = Lists.newArrayList(defaultObject, schedule, ec2, activity);

		// putPipelineDefinition
		PutPipelineDefinitionRequest request = new PutPipelineDefinitionRequest();
		request.setPipelineId(id);
		request.setPipelineObjects(objects);
		PutPipelineDefinitionResult putPipelineResult = client.putPipelineDefinition(request);

		if (putPipelineResult.isErrored()) 
		{
			logger.error("Error found in pipeline definition: ");
			putPipelineResult.getValidationErrors().stream().forEach(e -> logger.error(e));
		}

		if (putPipelineResult.getValidationWarnings().size() > 0) 
		{
			logger.warn("Warnings found in definition: ");
			putPipelineResult.getValidationWarnings().stream().forEach(e -> logger.warn(e));
		}
	}

Now you can activate the pipeline for execution:

	public void activatePipeline(String id) throws Exception
	{
		System.out.println("ACTIVATE PIPELINE.");	

		ActivatePipelineRequest request = new ActivatePipelineRequest();
		request.setPipelineId(id);
		client.activatePipeline(request);
	}

Then, you can delete the pipeline:

	public void deletePipeline(String id) throws Exception
	{
		System.out.println("DELETE PIPELINE.");	

		DeletePipelineRequest request = new DeletePipelineRequest();
		request.setPipelineId(id);
		client.deletePipeline(request);
	}

After checking out the demo code from GitHub, you should modify the code to use your own S3 bucket for logging, as well as for the stdout and stderr outputs of the ShellCommandActivity. After making these changes, you can run the demo code with the following commands:

$ mvn clean; mvn compile; mvn package
$ java -cp target/demo-1.0-SNAPSHOT.jar:third-party/guava-18.0.jar -Dlog4j.configurationFile=log4j2.xml net.qyjohn.aws.DemoDataPipeline create
$ java -cp target/demo-1.0-SNAPSHOT.jar:third-party/guava-18.0.jar -Dlog4j.configurationFile=log4j2.xml net.qyjohn.aws.DemoDataPipeline list
$ java -cp target/demo-1.0-SNAPSHOT.jar:third-party/guava-18.0.jar -Dlog4j.configurationFile=log4j2.xml net.qyjohn.aws.DemoDataPipeline define df-0098814S3FS9ACXICID  (replace the pipeline id with your own)
$ java -cp target/demo-1.0-SNAPSHOT.jar:third-party/guava-18.0.jar -Dlog4j.configurationFile=log4j2.xml net.qyjohn.aws.DemoDataPipeline activate df-0098814S3FS9ACXICID  (replace the pipeline id with your own)
$ java -cp target/demo-1.0-SNAPSHOT.jar:third-party/guava-18.0.jar -Dlog4j.configurationFile=log4j2.xml net.qyjohn.aws.DemoDataPipeline delete df-0098814S3FS9ACXICID  (replace the pipeline id with your own)
