Friday, December 9, 2016

InstaPics - Instagram Magento Extension

InstaPics is a Magento extension to showcase Instagram images in your online store. It helps you fetch Instagram images into your store by #hashtag or by user details.

Social commerce has been one of the biggest commerce trends of the last few years. Social media is now a big part of everyone's day-to-day life, and Instagram is one of the most popular networks across all age groups, especially among youngsters. Merchants can increase the market value of their products by showcasing how their customers actually use them. This builds interest in your products, and it also helps your customers learn more ways to use them.

Instagram images can be displayed anywhere throughout the website in a fully responsive layout. Images can be shown in either of two fashions:
  • Grid Layout: In this layout you can configure the number of images to show in the grid, the size of the grid, how to fetch the images (either by #hashtag or by user details), etc.
  • Slider Layout: In this layout you can configure the complete slider - number of images, blocks to show in one slide, pagination, movement, margin, how to fetch the images (either by #hashtag or by user details), etc.
One can insert Instagram images into any block, section, or page of the website. The block is fully responsive and can be moulded to the required size according to its container.
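For illustration, the kind of request such an extension might issue can be sketched as follows. This is a minimal sketch: the legacy Instagram API endpoint shown here was current in 2016 and has since been deprecated, and the access token is a placeholder.

```python
# Sketch: build the request URL an extension like InstaPics might call to
# fetch recent media for a hashtag, using the 2016-era legacy Instagram API
# (since deprecated). No network call is made here.
from urllib.parse import quote, urlencode

API_BASE = "https://api.instagram.com/v1"

def tag_media_url(hashtag: str, access_token: str, count: int = 12) -> str:
    """Return the 'recent media for a tag' endpoint URL."""
    tag = quote(hashtag.lstrip("#"))
    query = urlencode({"access_token": access_token, "count": count})
    return f"{API_BASE}/tags/{tag}/media/recent?{query}"

print(tag_media_url("#summer", "TOKEN", count=8))
```

The extension would request this URL, parse the JSON response, and render the image URLs into the configured grid or slider block.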

Tuesday, November 22, 2016

Pragmatic has provided a connection with the social feeds of Facebook and Twitter in Odoo


With the Odoo 9 Social Feed module, Pragmatic has provided a connection with the social feeds of Facebook and Twitter. The feeds from both sites are combined and presented to the user in chronological order. There is also a facility for custom Odoo posts, which can be added alongside them.
  • Before starting with this module, the user needs to specify, in the configuration, the account names from which the feeds are to be fetched.
  • Feeds from multiple accounts can also be fetched.
  • The feeds related to the specified Facebook and Twitter accounts are fetched, arranged in ascending order of time elapsed, and displayed to the user.
Facebook Feeds
  • To identify that a particular post is from Facebook, a Facebook icon is shown on top of that post.
  • The name of the account from which the post was fetched is shown beside the icon. This is helpful when multiple accounts are specified.
Twitter Feeds
  • Similarly, to identify that a particular post is from Twitter, a Twitter icon is shown on top of that post.
  • The name of the account from which the post was fetched is shown beside the icon. This is helpful when multiple accounts are specified.
Custom Feeds
  • There is also a facility to send custom feeds (posted by Odoo users).
  • Users can internally post company-related feeds, best wishes, or any other news, which will be visible to other Odoo users.
  • Internal posts are written in the textbox given at the bottom of the page.
  • You can also attach an image that illustrates your text message.
  • The image icon beside the textbox lets you select images available on your local machine.
  • Once uploaded, the image is added to the textbox.
  • After the message is complete, click the Post button.
  • Once the Post button is clicked, the message is posted above, along with a 'time ago' tag.
Source page of the feed
  • The complete Facebook and Twitter feeds are not displayed here, as they could occupy a lot of space. Hence, a 'read more..' option is provided.
  • The 'read more..' option is given at the end of every feed for viewing it in detail.
  • On clicking the link, the user is redirected to the source page of that feed.
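The chronological merge described above can be sketched as follows. The field names here are illustrative, not the module's actual schema.

```python
# Sketch: combine Facebook, Twitter, and custom Odoo posts into one timeline
# ordered by time elapsed, the way the Social Feed module displays them.
from datetime import datetime, timedelta

def combine_feeds(*feeds):
    """Merge feed lists and sort newest-first (ascending time elapsed)."""
    merged = [post for feed in feeds for post in feed]
    return sorted(merged, key=lambda p: p["created_at"], reverse=True)

now = datetime(2016, 11, 22, 12, 0)
facebook = [{"source": "facebook", "created_at": now - timedelta(hours=3)}]
twitter = [{"source": "twitter", "created_at": now - timedelta(hours=1)}]
custom = [{"source": "odoo", "created_at": now - timedelta(hours=2)}]

timeline = combine_feeds(facebook, twitter, custom)
print([p["source"] for p in timeline])  # newest post first
```

Each rendered entry would then carry its source icon (Facebook, Twitter, or Odoo) and account name, as described above.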

Sunday, November 13, 2016

Odoo 10 Community Edition on AWS Marketplace


Pragmatic has launched Odoo 10 on the Amazon Cloud Marketplace. With the click of a button you can launch an instance of Odoo 10 Community Edition.


Odoo 10 Community Edition Features

Event

  • Event Barcode
  • Email Schedule: easy to follow up

Account

HR

  • HR Attendance (Kiosk Mode)
  • Print Employee Badge
  • Sign In/Out authenticated using a PIN
  • Timesheet Apps: usability improvements

POS

  • POS Serial Number
  • POS Default Cash Control
  • POS Restaurant : Transfer order from one table to another

Stock

  • Option in the payment gateway to auto-confirm SOs
  • Optional out-of-stock warning
  • Delivery: choose the package when clicking "Put in Pack"
  • Inventory: quality control can be applied on picking
  • Serial Number: upstream traceability
  • Delivery Order: add a margin in % to cover losses
  • Picking: up/down traceability

Website/ Ecommerce:

  • B2B/B2C
  • New checkout design for address selection
  • Ecommerce: add multiple images per product
  • Improved portal frontend view
  • Easy to set the website favicon
  • Ecommerce insight: save, manage, and reuse credit cards. Authorize the amount at checkout and capture at shipping.
  • Ecommerce users can pay through a stored card
  • Easy to trace website orders and invoices

Expense :

  • The accountant can directly pay the expense
  • Email alias to directly record an expense (based on the expense's internal reference, the system identifies the product and creates the expense accordingly)

General / Discuss :

  • Chatter history is clickable (links to the source document)
  • The debug mode does not split the web assets by default
  • Keyboard shortcuts detailed in the top-right menu on the home page
  • Easy to maintain user access (set default access for the default user)
  • Search date ranges quickly with the new in-between operator
  • Canned responses and /commands in Discuss
  • Create in one click
  • Company settings for apps moved to Apps > Settings
  • Any HTML-type report can easily be edited in the app view => https://drive.google.com/file/d/0B21cUNlAdZ6gWHl5NUE4b0lqc0k/view?usp=drivesdk

Studio:

  • Easy to create new apps
  • Easy to add a new field in either form view or tree view
  • Change strings, help messages, views, reports, ...

Purchase :

  • Editable PO: easy to edit a confirmed PO
  • Purchase Tender: blanket order type

Project:

  • Project: the project dashboard is now based on the user's favorites
  • Project: easy to maintain sub-tasks
  • Forecast: grid by user, by project

Subscription:

  • Subscription dashboard by company, tag, contract
  • Subscription: new cohort analysis

Helpdesk Management:

Easy to assign tickets using different assignation methods:
  • Randomly
  • Manually
  • Balanced
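A minimal sketch of how the random and balanced methods might work (the names here are illustrative, not Odoo's internal API; "balanced" is taken to mean giving the new ticket to the member with the fewest open tickets):

```python
# Sketch of two of the assignation methods above: random and balanced.
import random

def assign_random(members, rng=random):
    """Pick any team member at random."""
    return rng.choice(members)

def assign_balanced(members, open_counts):
    """Pick the member with the fewest open tickets."""
    return min(members, key=lambda m: open_counts.get(m, 0))

members = ["alice", "bob", "carol"]
open_counts = {"alice": 4, "bob": 1, "carol": 3}
print(assign_balanced(members, open_counts))  # 'bob'
```

Manual assignment needs no logic at all: the user simply chooses the assignee on the ticket form.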
Generate Tickets
  • Email Alias
  • Live Chat
  • Ticket Form
  • External API
Performance
  • SLA Policies
  • Rating
  • Canned Response
Self-Service
  • Forum- Help Center
  • Slides- eLearning

MRP

  • PLM
  • MPS
  • Maintenance
  • Quality
  • Easy to know "Overall Equipment Effectiveness"
  • Unbuild Order, Scrap Products

Thursday, October 20, 2016

Odoo 10 Helpdesk Management

Helpdesk is a new module introduced with the release of Odoo 10. It helps maintain the company's helpdesk with the various features listed below.

Create tickets using multiple channels
  • Users can create tickets manually, configure the module so that incoming emails create tickets, or use a website form. Third-party applications can also be connected via the API, and tickets can be created using web services.

Ticket status can be tracked through the New, In Progress, Solved, and Cancelled stages.

A priority is assigned to each ticket: low priority, high priority, or urgent.
Assign tickets to users
  • A helpdesk team is configured to handle the generated tickets. Users can define helpdesk team members and assign tickets to them using manual, random, or balanced methods.

SLAs (Service Level Agreements)
  • Configure service level agreements and automate related checks and actions through SLA policies, such as the maximum resolution time for urgent tickets.
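A minimal sketch of such an SLA check, with illustrative time limits (not Odoo defaults):

```python
# Sketch: flag a ticket as breaching its SLA when the time elapsed since
# creation exceeds the limit configured for its priority.
from datetime import datetime, timedelta

SLA_LIMITS = {
    "urgent": timedelta(hours=2),
    "high": timedelta(hours=8),
    "low": timedelta(hours=24),
}

def sla_breached(created_at, priority, now):
    """True if the ticket has been open longer than its SLA allows."""
    return (now - created_at) > SLA_LIMITS[priority]

now = datetime(2016, 10, 20, 12, 0)
print(sla_breached(now - timedelta(hours=3), "urgent", now))  # True
print(sla_breached(now - timedelta(hours=3), "high", now))    # False
```

In practice such a check would run periodically and trigger the automated actions (escalation, notification) attached to the SLA policy.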

The dashboard shows various helpdesk statistics, such as average open hours, SLA compliance, ticket priorities, performance (actual vs. targeted), and success rate.

Tuesday, October 18, 2016

Odoo 10 Manufacturing MRP Enterprise Edition v/s Odoo 10 Manufacturing MRP Community Edition Features Comparison




Features / Apps                         Odoo 10 Community   Odoo 10 Enterprise
Manufacturing Orders                    ✓                   ✓
BOM                                     ✓                   ✓
Routes                                  ✓                   ✓
Work Orders                             ✓                   ✓
Plant Floor Dashboard                   ×                   ✓
Work Center Planning                    ×                   ✓
Master Production Schedule              ×                   ✓
Quality                                 ×                   ✓
PLM                                     ×                   ✓
Preventive and Corrective Maintenance   ×                   ✓
Barcode Interface                       ×                   ✓
Tablet and Mobile Support               ×                   ✓
KPIs Statistics and Dashboards          ×                   ✓

Wednesday, October 5, 2016

Odoo on AWS Cloud



What is Odoo?



Odoo is an open source business application: modern software for smart businesses. Boost your sales, step up productivity, and manage all day-to-day activities. Fully integrated, simple, and mobile. Odoo is flexible and constantly evolving.

Introduction


  • Business suite of applications: ERP, CRM, HRM, and more.
  • Open source.
  • Low cost.
  • Provides standard applications.
  • Modular.
  • Easy to customize and improve features.
  • Grew fast and keeps growing.
  • Web based.
  • Fully integrated.


Odoo Features


Odoo is now the All-in-one business software


Common Challenges faced while Deploying and Hosting Odoo






Odoo Deploying On Cloud



Pragmatic provides a one-click install solution for Odoo. Run your own Odoo server in the cloud. Odoo isn't just one application, it's hundreds. Odoo is an enterprise resource platform from which you can manage all your business operations, from supply chain and project management to accounting and HR. Out of the box, Odoo includes messaging, sales CRM, and reporting modules. Click the settings tab and you'll be presented with nearly 2000 other modules - bug tracking, project management, timesheets, MRP, recruiting, calendar, warehouse management, and much more - that can be deployed with one click.

iPaaS: Scalable, Reliable, High Performance Platform



Pragmatic has powered Odoo on Amazon Web Services (AWS), which provides a highly reliable, scalable, low-cost infrastructure platform in the fast-growing cloud spectrum that fuels millions of businesses across the globe.
Deploy your enterprise apps using Odoo on AWS and enjoy the benefits of low-cost, pay-as-you-go pricing and elastic capacity on a global cloud infrastructure with data centers around the globe.

Features of Odoo iPaaS




We have 4 plans:


Plans

  • Single Server Odoo 9 @AWS Cloud Platform - Apricot: "For a start - the best startup." Best suited: startups, developers.
    High Performance Operation Heap; 1 Odoo server and Postgres database; phpPgAdmin; highly secure & reliable; Odoo custom modules; Odoo 9 Community Edition.
  • Multi Server Odoo 9 @AWS Cloud Platform - Orange: "For daily use with high benefits." Best suited: small business.
    High Performance Operation Heap; 1 Odoo server; 1 database server; database & system images; phpPgAdmin; highly secure & reliable; Odoo custom modules.
  • Multi Server Odoo 9 @AWS Cloud Platform - Apple: "The best of both worlds." Best suited: mid-sized business.
    High Performance Operation Heap; 1 load balancer; 2 Odoo servers; 1 Postgres database server; Redis cache; database & system images; phpPgAdmin; highly secure & reliable.
  • Multi Server Odoo 9 @AWS Cloud Platform - Mango: "The King of indulgence - live life king size." Best suited: large enterprises.
    High Performance Operation Heap; application firewall; 1 load balancer; auto scaling (webserver); Redis cache; 1 Postgres database server; database & system images; phpPgAdmin.

Tuesday, October 4, 2016

HIPAA Compliance with AWS


AWS HIPAA Compliance


Amazon Web Services (AWS) can be used to create applications compliant with HIPAA (the Health Insurance Portability and Accountability Act). HIPAA's Privacy and Security Rules govern the protection of Protected Health Information (PHI).

HIPAA and HITECH impose requirements related to the use and disclosure of PHI, appropriate safeguards to protect PHI, individual rights, and administrative responsibilities.

Covered entities and their business associates can use the secure, scalable, low-cost IT provided by Amazon Web Services (AWS) to architect applications in alignment with HIPAA and HITECH compliance requirements. AWS services and data centers have multiple layers of operational and physical security to help ensure the integrity and safety of customer data. AWS offers a standardized Business Associate Addendum (BAA) for such customers. Customers may use any AWS service in an account designated as a HIPAA Account, but they may only process, store, and transmit PHI using the HIPAA-eligible services defined in the AWS BAA.

Amazon Web Services that are HIPAA eligible


  • Amazon DynamoDB
  • Amazon Elastic Block Store (Amazon EBS)
  • Amazon Elastic Compute Cloud (Amazon EC2)
  • Elastic Load Balancing
  • Amazon Elastic MapReduce (Amazon EMR)
  • Amazon Glacier
  • Amazon Redshift
  • Amazon Relational Database Service (Amazon RDS) for MySQL
  • Amazon RDS for Oracle
  • Amazon Simple Storage Service (Amazon S3)


HIPAA architectures on AWS


AWS provides multiple services to deploy a highly available, scalable, secure application stack, which can serve a limitless variety of healthcare applications and use cases. In this blog, we will embark on a journey into HIPAA-eligible architectures by scoping the discussion to the following deployment diagram, which can be adopted as a starting point for building a HIPAA-eligible, web-facing application.


The underlying theme to this architecture is encryption everywhere.

HIPAA ON AWS PROCESS


1) Obtain a Business Associate Agreement with AWS

Once you have determined that storing, processing, or transmitting protected health information (PHI) is absolutely necessary, before moving any of this data to AWS infrastructure you must contact AWS and make sure you have all the necessary contracts and a Business Associate Agreement (BAA) in place. These contracts will serve to clarify and limit, as appropriate, the permissible uses and disclosures of protected health information.

2) Authentication and Authorization

The authentication and authorization mechanisms you define for your HIPAA-eligible system must be documented as part of a System Security Plan (SSP) with all roles and responsibilities documented in detail along with a configuration control process that specifies initiation, approval, change, and acceptance processes for all change requests. Although the details of defining these processes won’t be discussed here, the AWS Identity and Access Management (AWS IAM) service does offer the granular policies required for achieving the necessary controls under HIPAA and HITECH.
As a best practice, enable multi-factor authentication (MFA) on your AWS root account, lock away the root access keys, and apply the same protections to any IAM account that has significant privileges in your AWS account.

3) Web and Application Layers

DNS resolution is relatively straightforward and can be achieved using Amazon Route 53. Just be sure not to use any PHI in the URLs.
Amazon Elastic Load Balancer Configuration
The primary entity that receives the request from Amazon Route 53 is an Internet-facing Elastic Load Balancer. There are multiple ways in which an ELB load balancer can be configured, as explained here. To protect the confidential PHI data, you must enable secure communication options only, like HTTPS-based or TCP/SSL-based end-to-end communication. Although you can use TCP/SSL pass-through mode on the ELB load balancer for your web tier requests, using this option limits the use of some of the HTTP/HTTPS specific features like sticky sessions and X-Forward-For headers. For this reason, many startups prefer to make use of HTTPS-based communication on ELB, as shown in the following screenshot.


As shown in the configuration, there’s a single listener configured that accepts HTTPS requests on port 443 and sends requests to back-end instances using HTTPS on port 443. Because HTTPS is used for the front-end connection, you must create the certificate as per your publicly accessible domain name, get the certificate signed by a CA (for an internal load balancer you can use a self-signed certificate as well), and then upload the certificate using AWS IAM, which manages your SSL certificates, as explained in the ELB documentation. This certificate is then utilized to decrypt the HTTPS-based encrypted requests that are received by the ELB load balancer.

To route the requests from the ELB load balancer to the back-end instances, you must use back-end server authentication so that the communication is encrypted throughout. You can enable this by creating a public key policy that uses a public key for authentication. You use this public key policy to create a back-end server authentication policy. Finally, you enable the back-end server authentication by setting the back-end server authentication policy with the back-end server port, which in this case would be 443 for an HTTPS protocol. For an example of how to set this up easily using OpenSSL, check out the ELB documentation and Apache Tomcat’s documentation on certificates.
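The listener described above (HTTPS on port 443 on both the front end and the back end) can be sketched as the parameter shape a boto3 `create_load_balancer()` call for a classic ELB expects. The certificate ARN is a placeholder and no AWS call is made here.

```python
# Sketch: the single HTTPS-in/HTTPS-out listener described in the text,
# expressed as a classic-ELB listener definition.
listener = {
    "Protocol": "HTTPS",          # front end: client TLS terminates at the ELB
    "LoadBalancerPort": 443,
    "InstanceProtocol": "HTTPS",  # back end: traffic is re-encrypted to instances
    "InstancePort": 443,
    "SSLCertificateId": "arn:aws:iam::123456789012:server-certificate/example-cert",
}
print(listener["Protocol"], "->", listener["InstanceProtocol"])
```

This definition would be passed in the `Listeners` list of `create_load_balancer()`, with back-end server authentication configured separately as described above.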

WAF/IDS/IPS Layer Many of our customers make use of an extra layer of security (like web application firewalls and intrusion detection/prevention solutions) in front of their web layer to avoid any potential malicious attacks to their sensitive applications. There are multiple options available in the AWS Marketplace to provision tools like WAF/IDS/IPS, etc. So you could start from there instead of setting it up from scratch on an EC2 instance.

Web Layer
The next layer is the web tier, which can be auto-scaled for high availability and placed behind an internal ELB load balancer with only an HTTPS listener configured. To further secure access to the web servers, you should open up your web server instances' security group to accept requests only from the designated load balancer, as shown in the following diagram.


App Layer
Encryption of traffic between the web layer and app layer will look similar to the setup in the preceding diagram. Again, there will be an internal ELB load balancer with HTTPS listener configured. On the application servers, SSL certificates are set up to keep the communication channel encrypted end-to-end.
Both the app and web layers should also be in private subnets with auto-scaling enabled to ensure a highly responsive and stable healthcare application.

4) Database Layer
The easiest way to get started with database encryption is to make use of Amazon RDS (MySQL or Oracle engine). To protect your sensitive PHI data, you should consider the following best practices for Amazon RDS:

  • You should have access to the database enabled only from the application tier (using appropriate security group/NACL rules).
  • Any data that has the potential to contain PHI should always be encrypted by enabling the encryption option for your Amazon RDS DB instance, as shown in the following screenshot. Data that is encrypted at rest includes the underlying storage for a DB instance, its automated backups, read replicas, and snapshots.


  • For encryption of data in-transit, MySQL provides a mechanism to communicate with the DB instance over an SSL channel, as described here. Likewise, for Oracle RDS you can configure Oracle Native Network Encryption to encrypt the data as it moves to and from a DB instance.
  • For encryption of data at rest, you could also make use of Oracle’s Transparent Data Encryption (TDE) by setting the appropriate parameter in the Options Group associated with the RDS instance. With this, you can enable both TDE tablespace encryption (encrypts entire application tables) and TDE column encryption (encrypts individual data elements that contain sensitive data) to protect your PHI data. You could also store the Amazon RDS Oracle TDE Keys by leveraging AWS CloudHSM, a service that provides dedicated Hardware Security Module (HSM) appliances within the AWS cloud. More details on this integration are available here.
    For additional discussion on Amazon RDS encryption mechanisms, please refer back to the whitepaper.


5) Backup/Restore
To protect your patient data, you should be vigilant about your backup and restore processes. Most AWS services have mechanisms in place to perform backup so that you can revert to a last known stable state if any changes need to be backed out. For example, features like EC2 AMI creation or snapshotting (as in the Amazon EBS, Amazon RDS, and Amazon Redshift services) should be able to meet the majority of backup requirements.
You can also make use of third-party backup tools, which integrate with Amazon S3 and Amazon Glacier to manage secure, scalable, and durable copies of your data. When using Amazon S3, you have multiple ways to encrypt your data at rest and can leverage both client-side encryption and server-side encryption mechanisms. Details on these options are available in the Amazon S3 documentation. PHI in S3 buckets should always be encrypted. You can also enforce the server-side encryption (SSE) option on any of the buckets by adding the following condition to your Amazon S3 bucket policy:

"Condition": {
    "StringEquals": {
        "s3:x-amz-server-side-encryption": "AES256"
    },
    "Bool": {
        "aws:SecureTransport": "true"
    }
},
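For context, such a condition is typically used inside a policy statement. A minimal sketch of a complete policy that rejects unencrypted uploads (the commonly used Deny-on-`StringNotEquals` pattern) might look like this; the bucket name is a placeholder:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyUnencryptedObjectUploads",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::example-phi-bucket/*",
      "Condition": {
        "StringNotEquals": {
          "s3:x-amz-server-side-encryption": "AES256"
        }
      }
    }
  ]
}
```

With this policy attached, any `PutObject` request that omits the server-side encryption header is rejected.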
For security of data in transit, you should always use Secure Sockets Layer (SSL) enabled endpoints for all the services, including Amazon S3 for backups. If you are enabling backup of your data from the EC2 instances in a VPC to Amazon S3, then you could also make use of VPC endpoints for Amazon S3. This feature creates a private connection between your private VPC and Amazon S3 without requiring access over the Internet or a NAT/proxy device.

6) EC2 and EBS requirements

Amazon EC2 is a scalable, user-configurable compute service that supports multiple methods for encrypting data at rest, ranging from application-level or field-level encryption of PHI as it is processed, to transparent data-encryption features of commercial databases, to the use of third-party tools. For a more complete discussion of the options, see the whitepaper.
In the next example, we show you a simple approach to architecting HIPAA-eligible web servers.
First, you must be sure that your EC2 instance is running on hardware that is dedicated to a single customer by using a dedicated instance. You can do this by setting the tenancy attribute to “dedicated” on either the Amazon VPC that the instance is launched in, the Auto-Scaling Launch Configuration, or on the instance itself, as shown in the following screenshot.


Because Amazon Elastic Block Store (Amazon EBS) storage encryption is consistent with HIPAA guidance at the time of this blog writing, the easiest way to fulfill the at-rest encryption requirement is to choose an EC2 instance type that supports Amazon EBS encryption, and then add the encrypted EBS volume to your instance. (See the EBS link for a list of instance types.)


You should keep all of your sensitive PHI data on the encrypted EBS volumes, and be sure never to place PHI on the unencrypted root volume.
You might want to take some additional precautions to ensure that the unencrypted volume does not get used for PHI. For example, you can consider a partner solution from the AWS Marketplace, which offers full-drive encryption to help you feel more at ease. This will help to ensure that if there ever is a program (such as a TCP core dump) that uses the root drive as temporary storage or scratch space without your knowledge, it will be encrypted. Other startups have developed their own techniques for securing the root volume by using Logical Volume Management (LVM) to repartition the volume into encrypted segments and to make other portions read-only.

7) Key Management

At every turn in this architecture, we have mentioned encryption. Ensuring end-to-end encryption of our PHI is an essential component of keeping our data secure. Encryption in flight protects you from eavesdroppers, and encryption at rest defends against hackers of the physical devices. However, at some point we do need to open this ciphertext PHI in order to use it in our application. This is where key management becomes a “key” piece of the implementation (pun intended).
AWS does not place limitations on how you choose to store or manage your keys. Essentially, there are four general approaches to key management on AWS:

  1. Do it yourself
  2. Partner solutions
  3. AWS CloudHSM
  4. AWS KMS


A full discussion (or even a good starting discussion) on key management far exceeds what we can provide in a single blog entry, so we will just provide some general advice about key management as it relates to HIPAA.
The first piece of advice is that you should strongly consider the built-in AWS option. All of the checkbox encryption methods — such as Amazon S3 server-side encryption, Amazon EBS encrypted volumes, Amazon Redshift encryption, and Amazon RDS encryption make it very easy to keep your PHI encrypted and you should explore these options to see if these tools meet your BAA requirements and HHS guidance. These methods automate or abstract many of the tasks necessary for good key maintenance such as multifactor encryption and regular key rotation. AWS handles the heavy lifting and ensures that your encryption methods are using one of the strongest block ciphers available.
If you need to create a separation of duties between staff that maintain the keys vs. developers who work with the keys, or if you would simply like additional control of your keys and want to be able to easily create, control, rotate and use your encryption keys then you should look at using the Amazon Key Management Service (KMS). This service is still integrated with AWS SDKs and other AWS services like AWS CloudTrail, which can help provide auditable logs to help meet your HIPAA compliance requirements.
If you need additional controls beyond what is provided by AWS, you should be sure that you have proper security experts who can ensure the safe management of your encryption keys. Remember, a lost key could render your entire dataset useless, and AWS Support will not have any way to help a problematic situation.
For more on encryption and key Management in AWS, check out this video from last year’s re:Invent, and read the Securing Data at Rest with Encryption whitepaper.

8) Logging and Monitoring

Logging and monitoring of system access will play a starring role in your HIPAA-eligible architecture. The goal is to put auditing in place to allow security analysts to examine detailed activity logs or reports to see who had access, IP address entry, what data was accessed, etc. The data should be tracked, logged, and stored in a central location for extended periods of time in case of an audit.
At the AWS account level, be sure to launch AWS CloudTrail and immediately start recording all AWS API calls. You should also launch AWS Config, which will provide you with an AWS resource inventory, configuration history, and configuration change notifications.
You will also need to monitor and maintain the logs of your AWS resources for keeping a record of system access to PHI as well as running analytics that could serve as part of your HIPAA Security Risk Assessment. One way to do this is with AWS CloudWatch, a monitoring service that you can use to collect server logs from your EC2 instances as well as logs from the Amazon RDS DB instance, Amazon EBS volumes, and the ELB elastic load balancer. You can even develop custom metrics to obtain the necessary log information from your own applications.
CloudWatch has other useful features:

  • View graphs and statistics on the console
  • Set up alarms to automatically notify you of abnormal system behavior
  • Capture network traffic in a single repository through the integration of CloudWatch with VPC Flow Logs


With all these logging mechanisms, you want to be sure that no PHI is actually stored in the logs. This usually requires some special attention. For example, sometimes you might need to encrypt PHI in your custom metric before sending to AWS CloudTrail. You also should be aware of everything that is coming into the logs. For example, the combination of session user and IP address coming from the ELB logs is considered PHI in some situations, so you should catch these special circumstances to be sure PHI is fully scrubbed from the logs.
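A simple scrubbing pass of the kind described above can be sketched as follows. The redaction rule here (IPv4 addresses) is purely illustrative; real scrubbing rules depend on what your logs actually contain and what counts as PHI in your context.

```python
# Sketch: scrub an obvious PHI-bearing pattern (IPv4 addresses) from a log
# line before shipping it to CloudWatch or S3.
import re

IPV4 = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def scrub(line: str) -> str:
    """Replace any IPv4 address in the line with a redaction marker."""
    return IPV4.sub("[REDACTED-IP]", line)

print(scrub("user=jdoe ip=203.0.113.42 action=view_record"))
```

Such a pass would sit in the log pipeline ahead of the central repository, so that the stored logs never contain the sensitive fields.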
Finally, Amazon S3 is a fantastic repository for all these logs. However, take extra precautions to lock down the permissions for log access of these highly sensitive data sets. You might want to consider some more stringent access requirements such as requiring multi-factor authentication to read the logs, turning on versioning to retain any logs that get deleted, or even setting up cross-region replication to keep a second copy of the logs in an entirely different AWS account.

AWS Environment


We will architect your AWS environment to meet your HIPAA compliance needs, and we will also provide ongoing 24/7 managed services to help ensure that your AWS environment remains HIPAA compliant. Our HIPAA Compliance Support Plan for AWS includes a comprehensive suite of security and support features designed specifically to address the HIPAA and HITECH standards, including the necessary levels of encryption within AWS.

Encryption and Protection of PHI in AWS



To implement encryption, customers may evaluate and take advantage of the encryption features native to the HIPAA-eligible services, or they can satisfy the encryption requirements through other means consistent with the Guidance. The following sections provide high-level details about using the available encryption features in each of the HIPAA-eligible services and other patterns for encrypting PHI. A final section describes how AWS KMS can be used to encrypt the keys used for encryption of PHI on AWS.

Amazon EC2


Amazon EC2 is a scalable, user-configurable compute service that supports multiple methods for encrypting data at rest.
For example, customers might select to perform application- or field-level encryption of PHI as it is processed within an application or database platform hosted in an Amazon EC2 instance.
Approaches range from encrypting data using standard libraries in an application framework such as Java or .NET; leveraging Transparent Data Encryption features in Microsoft SQL or Oracle; or by integrating other third-party and software as a service (SaaS)-based solutions into their applications. Customers can choose to integrate their applications running in Amazon EC2 with AWS KMS SDKs, simplifying the process of key management and storage. Customers can also implement encryption of data at rest using file-level or full disk encryption (FDE) by utilizing third-party software from AWS Marketplace Partners or native file system encryption tools (such as dm-crypt, LUKS, etc.).

Network Control
Network traffic containing PHI must encrypt data in transit. For traffic between external sources (such as the Internet or a traditional IT environment) and Amazon EC2, customers should use industry-standard transport encryption.

Mechanisms such as TLS or IPsec virtual private networks (VPNs), consistent with the Guidance. Internal to an Amazon Virtual Private Cloud (VPC) for data traveling between Amazon EC2 instances, network traffic containing PHI must also be encrypted; most applications support TLS or other protocols providing in-transit encryption that can be configured to be consistent with the Guidance. For applications and protocols that do not support encryption, sessions transmitting PHI can be sent through encrypted tunnels using IPsec or similar implementations between instances.

Amazon EC2 instances that customers use to process, store, or transmit PHI are run on Dedicated Instances, which are instances that run in an Amazon VPC on hardware dedicated to a single customer. Dedicated Instances are physically isolated at the host hardware level from instances that are not Dedicated Instances and from instances that belong to other AWS accounts. For more information on Dedicated Instances, see
http://docs.aws.amazon.com/AmazonVPC/latest/UserGuide/dedicated-instance.html.

Customers can launch Amazon EC2 Dedicated Instances in several ways:

  • Set the tenancy attribute of an Amazon VPC to “dedicated” so that all instances launched into the Amazon VPC will run as Dedicated Instances
  • Set the placement tenancy attribute of an Auto-Scaling Launch Configuration for instances launched into an Amazon VPC
  • Set the tenancy attribute of an instance launched into an Amazon VPC
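The third option above, setting tenancy per instance, can be sketched with boto3 (AWS's Python SDK). This is a minimal sketch, not a definitive implementation; the helper names are our own, and the AMI and subnet IDs in the usage note are placeholders:

```python
def dedicated_instance_params(ami_id, subnet_id, instance_type="m4.large"):
    """Build keyword arguments for EC2 run_instances with dedicated tenancy."""
    return {
        "ImageId": ami_id,
        "SubnetId": subnet_id,
        "InstanceType": instance_type,
        "MinCount": 1,
        "MaxCount": 1,
        # "dedicated" places this instance on single-tenant hardware even if
        # the VPC's own tenancy attribute is left at "default".
        "Placement": {"Tenancy": "dedicated"},
    }

def launch_dedicated_instance(ec2_client, ami_id, subnet_id):
    """Call run_instances with the dedicated-tenancy parameters above."""
    return ec2_client.run_instances(**dedicated_instance_params(ami_id, subnet_id))
```

With a boto3 EC2 client, `launch_dedicated_instance(boto3.client("ec2"), "ami-xxxxxxxx", "subnet-xxxxxxxx")` would start the instance (IDs are placeholders).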


Amazon Virtual Private Cloud offers a set of network security features well-aligned to architecting for HIPAA compliance. Features such as stateless network access control lists and dynamic reassignment of instances into stateful security groups afford flexibility in protecting the instances from unauthorized network access. Amazon VPC also allows customers to extend their own network address space into AWS, as well as providing a number of ways to connect their data centers to AWS. VPC Flow Logs provide an audit trail of accepted and rejected connections to instances processing, transmitting or storing PHI. For more information on Amazon VPC, see http://aws.amazon.com/vpc/.

Amazon Elastic Block Store


Amazon EBS encryption at rest is consistent with the Guidance that is in effect at the time of publication of this whitepaper. Because the Guidance might be updated, customers should continue to evaluate and determine whether Amazon EBS encryption satisfies their compliance and regulatory requirements. With Amazon EBS encryption, a unique volume encryption key is generated for each EBS volume; customers have the flexibility to choose which master key from the AWS Key Management Service is used to encrypt each volume key. For more information, see the Amazon EBS encryption documentation.
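As a hedged sketch of the above, creating an encrypted EBS volume with boto3 looks like the following; the helper names are our own, and passing no `kms_key_id` falls back to the account's default EBS master key:

```python
def encrypted_volume_params(availability_zone, size_gib, kms_key_id=None):
    """Build keyword arguments for EC2 create_volume with encryption enabled."""
    params = {
        "AvailabilityZone": availability_zone,
        "Size": size_gib,
        "VolumeType": "gp2",
        # Encrypted=True makes EBS generate a unique volume key, wrapped by
        # the chosen KMS master key (or the account's default EBS key).
        "Encrypted": True,
    }
    if kms_key_id:
        params["KmsKeyId"] = kms_key_id
    return params

def create_encrypted_volume(ec2_client, availability_zone, size_gib, kms_key_id=None):
    """Create the volume; ec2_client is a boto3 EC2 client."""
    return ec2_client.create_volume(
        **encrypted_volume_params(availability_zone, size_gib, kms_key_id)
    )
```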

Amazon Redshift


Amazon Redshift provides database encryption for its clusters to help protect data at rest. When customers enable encryption for a cluster, Amazon Redshift encrypts all data, including backups, by using hardware-accelerated Advanced Encryption Standard (AES)-256 symmetric keys. Amazon Redshift uses a four-tier, key-based architecture for encryption. These keys consist of data encryption keys, a database key, a cluster key, and a master key. The cluster key encrypts the database key for the Amazon Redshift cluster. Customers can use either AWS KMS or an AWS CloudHSM (Hardware Security Module) to manage the cluster key. Amazon Redshift encryption at rest is consistent with the Guidance that is in effect at the time of publication of this whitepaper. Because the Guidance might be updated, customers should continue to evaluate and determine whether Amazon Redshift encryption satisfies their compliance and regulatory requirements.

Amazon S3


Customers have several options for encryption of data at rest when using Amazon S3, including both server-side and client-side encryption and several methods of managing keys.
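One of the server-side options can be sketched with boto3: requesting SSE-KMS on upload. A minimal sketch with our own helper names; omitting `kms_key_id` uses the account's default S3 KMS key:

```python
def sse_kms_put_params(bucket, key, body, kms_key_id=None):
    """Build keyword arguments for S3 put_object with SSE-KMS encryption."""
    params = {
        "Bucket": bucket,
        "Key": key,
        "Body": body,
        # Ask S3 to encrypt the object server-side under an AWS KMS key.
        "ServerSideEncryption": "aws:kms",
    }
    if kms_key_id:
        params["SSEKMSKeyId"] = kms_key_id
    return params

def put_encrypted_object(s3_client, bucket, key, body, kms_key_id=None):
    """Upload the object; s3_client is a boto3 S3 client."""
    return s3_client.put_object(**sse_kms_put_params(bucket, key, body, kms_key_id))
```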

Amazon Glacier


Amazon Glacier automatically encrypts data at rest using AES 256-bit symmetric keys and supports transfer of customer data over secure protocols.
Connections to Amazon Glacier containing PHI must use endpoints that accept encrypted transport (HTTPS). For a list of regional endpoints, see http://docs.aws.amazon.com/general/latest/gr/rande.html.

Amazon RDS for MySQL


Amazon RDS for MySQL allows customers to encrypt MySQL databases using keys that customers manage through AWS KMS. On a database instance running with Amazon RDS encryption, data stored at rest in the underlying storage is encrypted consistent with the Guidance in effect at the time of publication of this whitepaper, as are automated backups, read replicas, and snapshots. Because the Guidance might be updated, customers should continue to evaluate and determine whether Amazon RDS for MySQL encryption satisfies their compliance and regulatory requirements. For more information on encryption at rest using Amazon RDS, see the Amazon RDS documentation.
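A hedged boto3 sketch of launching an encrypted MySQL instance follows; the helper names and the instance class/storage values are our own illustrative choices, and `StorageEncrypted` is the flag that covers storage, backups, read replicas, and snapshots as described above:

```python
def encrypted_mysql_params(instance_id, master_username, master_password,
                           kms_key_id=None):
    """Build keyword arguments for RDS create_db_instance with encryption."""
    params = {
        "DBInstanceIdentifier": instance_id,
        "DBInstanceClass": "db.m4.large",
        "Engine": "mysql",
        "AllocatedStorage": 100,
        "MasterUsername": master_username,
        "MasterUserPassword": master_password,
        # StorageEncrypted covers the instance storage, automated backups,
        # read replicas, and snapshots.
        "StorageEncrypted": True,
    }
    if kms_key_id:
        params["KmsKeyId"] = kms_key_id
    return params

def create_encrypted_mysql(rds_client, instance_id, user, password, kms_key_id=None):
    """Create the instance; rds_client is a boto3 RDS client."""
    return rds_client.create_db_instance(
        **encrypted_mysql_params(instance_id, user, password, kms_key_id)
    )
```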

Amazon RDS for Oracle


Customers have several options for encrypting PHI at rest using Amazon RDS for Oracle. Customers can encrypt Oracle databases using keys that customers manage through AWS KMS. On a database instance running with Amazon RDS encryption, data stored at rest in the underlying storage is encrypted consistent with the Guidance in effect at the time of publication of this whitepaper, as are automated backups, read replicas, and snapshots. Because the Guidance might be updated, customers should continue to evaluate and determine whether Amazon RDS for Oracle encryption satisfies their compliance and regulatory requirements. For more information on encryption at rest using Amazon RDS, see the Amazon RDS documentation.

Elastic Load Balancing


Customers may use Elastic Load Balancing to terminate and process sessions containing PHI, choosing either the Classic Load Balancer or the Application Load Balancer. Because all network traffic containing PHI must be encrypted in transit end-to-end, customers have the flexibility to implement two different architectures:

Customers can terminate HTTPS, HTTP/2 over TLS (for the Application Load Balancer), or SSL/TLS on Elastic Load Balancing by creating a load balancer that uses an encrypted protocol for connections. This feature enables traffic encryption between the customer’s load balancer and the clients that initiate HTTPS, HTTP/2 over TLS, or SSL/TLS sessions, and for connections between the load balancer and customer back-end instances. Sessions containing PHI must use encrypted front-end and back-end listeners for transport encryption. Customers should evaluate their certificates and session negotiation policies and maintain them consistent with the Guidance.
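Creating an HTTPS front-end listener on an Application Load Balancer can be sketched with boto3 as follows. The helper name is our own, the ARNs are placeholders, and the security policy shown (`ELBSecurityPolicy-TLS-1-2-2017-01`) is one TLS-1.2-only policy AWS has offered; customers should evaluate the policy against the current Guidance rather than take this as a recommendation:

```python
def https_listener_params(load_balancer_arn, certificate_arn, target_group_arn):
    """Build keyword arguments for elbv2 create_listener with HTTPS."""
    return {
        "LoadBalancerArn": load_balancer_arn,
        "Protocol": "HTTPS",
        "Port": 443,
        "Certificates": [{"CertificateArn": certificate_arn}],
        # The security policy pins the TLS versions and ciphers negotiated
        # on the front-end listener; review it against current Guidance.
        "SslPolicy": "ELBSecurityPolicy-TLS-1-2-2017-01",
        "DefaultActions": [{"Type": "forward", "TargetGroupArn": target_group_arn}],
    }

def create_https_listener(elbv2_client, lb_arn, cert_arn, tg_arn):
    """Create the listener; elbv2_client is a boto3 'elbv2' client."""
    return elbv2_client.create_listener(**https_listener_params(lb_arn, cert_arn, tg_arn))
```

Back-end encryption is configured separately on the target group, so PHI stays encrypted on both legs.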

Amazon EMR


Amazon EMR deploys and manages a cluster of Amazon EC2 instances into a customer’s account. All Amazon EC2 instances that process, store, or transmit PHI must be Dedicated Instances. In order to meet this requirement, EMR clusters must be created in a VPC with tenancy attribute of “dedicated.” This ensures that all cluster nodes (instances) launched into the VPC will run as Dedicated Instances.

Amazon DynamoDB


Connections to Amazon DynamoDB containing PHI must use endpoints that accept encrypted transport (HTTPS). For a list of regional endpoints, see http://docs.aws.amazon.com/general/latest/gr/rande.html#ddb_region.

PHI stored in Amazon DynamoDB must be encrypted at rest consistent with the Guidance. Amazon DynamoDB customers can use the application development framework of their choice to encrypt PHI in applications before storing the data in Amazon DynamoDB. Alternatively, a client-side library for encrypting content is available from the AWS Labs GitHub repository. Customers may evaluate this implementation for consistency with the Guidance. For more information, see https://github.com/awslabs/aws-dynamodb-encryption-java. Careful consideration should be given when selecting primary keys and when creating indexes so that unsecured PHI is not required for queries and scans in Amazon DynamoDB.

Using AWS KMS for Encryption of PHI


Master keys in AWS KMS can be used to encrypt/decrypt data encryption keys used to encrypt PHI in customer applications or in AWS services that are integrated with AWS KMS. AWS KMS can be used in conjunction with a HIPAA account, but PHI may only be processed, stored, or transmitted in HIPAA-eligible services. AWS KMS does not need to be a HIPAA-eligible service so long as it is used to generate and manage keys for applications running in other HIPAA-eligible services. For example, an application processing PHI in Amazon EC2 could use the GenerateDataKey API call to generate data encryption keys for encrypting and decrypting PHI in the application. The data encryption keys would be protected by customer master keys stored in AWS KMS, creating a highly auditable key hierarchy, as API calls to AWS KMS are logged in AWS CloudTrail.
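The GenerateDataKey flow above can be sketched in boto3. This is a sketch of the envelope pattern only: the actual symmetric encryption of PHI with the plaintext key (e.g., AES-GCM via a crypto library) is left out, and the helper names are our own:

```python
import base64

def request_data_key(kms_client, master_key_id):
    """Ask AWS KMS for a fresh 256-bit data key wrapped by the master key."""
    resp = kms_client.generate_data_key(KeyId=master_key_id, KeySpec="AES_256")
    # Plaintext is used to encrypt/decrypt PHI locally and must never be
    # persisted; CiphertextBlob is the KMS-wrapped copy safe to store.
    return resp["Plaintext"], resp["CiphertextBlob"]

def build_envelope(ciphertext, wrapped_key):
    """Store encrypted PHI next to its wrapped data key (envelope encryption)."""
    return {
        "ciphertext": base64.b64encode(ciphertext).decode("ascii"),
        "encrypted_key": base64.b64encode(wrapped_key).decode("ascii"),
    }
```

To read the record back, the application would call `kms_client.decrypt(CiphertextBlob=...)` on the stored wrapped key, and both calls appear in CloudTrail.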

Auditing, Back-Ups, and Disaster Recovery


HIPAA’s Security Rule also requires in-depth auditing capabilities, data back-up procedures, and disaster recovery mechanisms. The services in AWS contain many features that help customers address these requirements.

In designing an information system that is consistent with HIPAA and HITECH requirements, customers should put auditing capabilities in place that allow security analysts to examine detailed activity logs or reports to see who had access, what data was accessed, from what IP address, and so on. This data should be tracked, logged, and stored in a central location for extended periods of time in case of an audit. Using Amazon EC2, customers can run activity log files and audits down to the packet layer on their virtual servers, just as they do on traditional hardware. They can also track any IP traffic that reaches their virtual server instance. A customer’s administrators can back up the log files into Amazon S3 for long-term, reliable storage.

Under HIPAA, covered entities must have a contingency plan to protect data in case of an emergency and must create and maintain retrievable exact copies of electronic PHI. To implement a data back-up plan on AWS, Amazon EBS offers persistent storage for Amazon EC2 virtual server instances. These volumes can be exposed as standard block devices, and they offer off-instance storage that persists independently from the life of an instance. To align with HIPAA guidelines, customers can create point-in-time snapshots of Amazon EBS volumes that automatically are stored in Amazon S3 and are replicated across multiple Availability Zones, which are distinct locations engineered to be insulated from failures in other Availability Zones. These snapshots can be accessed at any time and can protect data for long-term durability. Amazon S3 also provides a highly available solution for data storage and automated back-ups. By simply loading a file or image into Amazon S3, multiple redundant copies are automatically created and stored in separate data centers. These copies can be accessed at any time, from anywhere (based on permissions), and are stored until intentionally deleted.

Disaster recovery, the process of protecting an organization’s data and IT infrastructure in times of disaster, is typically one of the more expensive HIPAA requirements to comply with. This involves maintaining highly available systems, keeping both the data and system replicated off-site, and enabling continuous access to both. AWS inherently offers a variety of disaster recovery mechanisms.

With Amazon EC2, administrators can start server instances very quickly and can use an Elastic IP address (a static IP address for the cloud computing environment) for graceful failover from one machine to another. Amazon EC2 also offers Availability Zones. Administrators can launch Amazon EC2 instances in multiple Availability Zones to create geographically diverse, fault tolerant systems that are highly resilient in the event of network failures, natural disasters, and most other probable sources of downtime. Using Amazon S3, a customer’s data is replicated and automatically stored in separate data centers to provide reliable data storage designed to provide 99.99% availability.

For more information on disaster recovery, see the AWS Disaster Recovery whitepaper available at http://aws.amazon.com/disaster-recovery/.

Reduces Time. Reduces Cost. Reduces Risk


Connectria helps customers reduce the cost of becoming HIPAA compliant on AWS, and significantly reduces the time required as well by avoiding costly delays and mistakes. We understand which AWS components are supported in a HIPAA environment, which are not, and how to implement them to meet the HIPAA standards. Connectria’s staff will also assist you in getting a Business Associate Agreement (BAA) signed with Amazon, and we will enter into a BAA directly with each of our customers as well.

Connectria’s security controls and processes go far beyond AWS, and extend throughout our entire company to all of our employees. Each of our staff members is required to take and pass HIPAA compliance certification, and we undergo an annual HIPAA HITECH assessment by a qualified third-party assessor to ensure that Connectria and our employees continue to meet the HIPAA HITECH standards. Connectria provides access to our HIPAA Compliance Team at no additional cost in order to assist our customers with achieving their HIPAA compliance objectives.

Many of our customers include Independent Software Vendors (ISVs) who serve the healthcare market and require HIPAA compliance. Some also wish to move their applications to a hosted Software as a Service (SaaS) model. Whether you are a Covered Entity, a Business Associate, or a technology provider to the healthcare market, Connectria can help you implement and manage a HIPAA Compliant environment in AWS.

HIPAA-Compliant Website


  • Information that is being transported must ALWAYS be encrypted.
  • PHI is backed up and is recoverable.
  • Using unique access controls, the information is accessible only by authorized personnel.
  • The information is not tampered with or altered.
  • Information can be permanently disposed of when no longer needed.
  • The information is located on a server secured per HIPAA Security Rule requirements and/or hosted by a web server company with whom you have a HIPAA Business Associate Agreement.
https://nuancedmedia.com/hipaa-compliant-website-design/

Monday, October 3, 2016

Pentaho DB related Queries

Database host on AWS



Database Questions


1. MySQL Bulk Loader step in Pentaho: We are having issues with the Fifo file parameter. Can we use this step when Spoon is installed on a Windows machine? We are running Pentaho locally and piping the data into AWS. Please see the screen capture below.


ANS:- Fifo File - This is the fifo file used as a named pipe. If it does not exist, it will be created with the commands mkfifo and chmod 666 (this is the reason why it does not work on Windows).
Workaround: Use the MySQL Bulk Loader job entry to process a whole file (suboptimal). Not supported, but worth testing: mkfifo and chmod are provided by the GNU Core Utilities.

2. We received an error message relating to the 'path to the psql client' in PostgreSQL bulk loader step. How can we find and apply the path to the psql client on our amazon EC2 instance running PostgreSQL?

ANS:- First, we need to define a parameter called psql_path in the kettle.properties file (the colon is escaped per Java properties syntax):
E.g. psql_path=c\:/Program Files (x86)/pgAdmin III/1.16/psql.exe
Then, to set the bulk loader's "Path to the psql client" property, we can use ${psql_path}.


3. What parameters should be set to increase data transfer speeds to a postgres database?

ANS:- Optimal PostgreSQL performance settings depend not only on the hardware configuration, but also on the size of the database, the number of clients, and the complexity of queries, so the database can only be tuned optimally given all of these parameters.
PostgreSQL settings (add/modify these settings in postgresql.conf and restart the database):
  1. max_connections = 10
  2. shared_buffers = 2560MB
  3. effective_cache_size = 7680MB
  4. work_mem = 256MB
  5. maintenance_work_mem = 640MB
  6. min_wal_size = 1GB
  7. max_wal_size = 2GB
  8. checkpoint_completion_target = 0.7
  9. wal_buffers = 16MB
  10. default_statistics_target = 100


4. If there are disallowed characters for a Postgres text field, what is the best way to handle those, e.g., the ASCII NUL character?

Ans:- solution 1:- The NULLIF function returns a null value if value1 equals value2; otherwise it returns value1. This can be used to perform the inverse operation of COALESCE.

solution 2:- There are different ways to handle special characters. E.g., escaping single quotes (') by doubling them up is the standard way and works of course: 'user's log' becomes 'user''s log'.

5. We are moving data from a DB2 database to AWS. The goal is to update the data in less than 8 hours. We have nine tables and the largest table includes about 130 million rows. Is this feasible? What is the best way to implement this strategy on AWS?

Ans:- solution 1:- In the first two parts of this series we discussed two popular products (out of many possible solutions) for moving big data into the cloud: Tsunami UDP and Data Expedition's ExpeDat S3 Gateway. Another option takes a different approach: Signiant Flight.

solution 2:- AWS Import/Export is a service you can use to transfer large amounts of data from physical storage devices into AWS. You mail your portable storage devices to AWS, and AWS Import/Export transfers data directly off of your storage devices using Amazon's high-speed internal network. Your data load typically begins the next business day after your storage device arrives at AWS. After the data export or import completes, we return your storage device. For large data sets, AWS data transfer can be significantly faster than Internet transfer and more cost-effective than upgrading your connectivity.

solution 3:- Snowball is a petabyte-scale data transport solution that uses secure appliances to transfer large amounts of data into and out of the AWS cloud. Using Snowball addresses common challenges with large-scale data transfers including high network costs, long transfer times, and security concerns. Transferring data with Snowball is simple, fast, secure, and can be as little as one-fifth the cost of high-speed Internet.

6. What is the largest dataset (relational database table) that Pragmatic has moved to AWS? How long did it take to update such a table? What performance strategies did Pragmatic undertake to achieve peak performance for updating such a table?

Ans:- If you look at typical network speeds, moving a terabyte dataset can take a long time. Depending on the network throughput available to you and the data set size, it may take rather long to move your data into Amazon S3. To help customers move their large data sets into Amazon S3 faster, we offer them the ability to do this over Amazon's internal high-speed network using AWS Import/Export.

7. What is Pragmatic's suggested approach for setting up an ETL architecture for an AWS-based datacenter?

Ans:- With Amazon Simple Workflow (Amazon SWF), AWS Data Pipeline, and AWS Lambda, you can build analytic solutions that are automated, repeatable, scalable, and reliable. These services can be used together to migrate and scale an on-premises data analytics workload.

Workflow basics


A business process can be represented as a workflow. Applications often incorporate a workflow as steps that must take place in a predefined order, with opportunities to adjust the flow of information based on certain decisions or special cases. The following is an example of an ETL workflow:

The graphic below is an overview of how SWF operates.


8. Rather than using Pentaho CE for ETL and reporting, what do you think are the advantages/disadvantages of implementing a hybrid environment running Pentaho ETL and Tableau Server? Have you implemented such a mixed environment for any of your clients?

Ans:- This can be done. Tableau does not have ETL, so we can use Pentaho ETL with Tableau. We have worked with Tableau and Pentaho in combination: you can use Pentaho for ETL and visualize the data using Tableau.

9. Do you have any clients in the United States that use Pragmatic support for Pentaho?

Ans:- We are a products and services company working primarily in ERP, CRM, BI, and analytics. We have worked with several customers from the United States and can give you a reference for ERP deployment and report generation.

10. Do you have any clients in the United States that used Pragmatic consulting services for setting up their ETL architecture? If so, do you mind listing them as a referral?

Ans:- We have customers in the United States and around the world (in countries such as Australia, New Zealand, Switzerland, and Belgium) who have used our AWS consulting expertise, not limited to Pentaho. We have also deployed scalable architectures on the AWS cloud. Unfortunately, most of these companies are middlemen, and since we have signed NDAs with them, we cannot disclose their names. However, we can definitely give you references from companies in the United States with whom we have worked on other technologies such as ERP. Will that work for you?

Odoo 10 Community and Enterprise Edition Features - MRP + Maintenance+ PLM + Quality

Wednesday, September 28, 2016

Internet-of-Things Work Scope- intel edison



The Internet of Things (IoT) has become one of the biggest disruptive technologies in the world; through it, the world reaches for greater connectivity across wide area networks. Nowadays many companies are working with AWS IoT technology, whose services provide fast communication from one medium to another. The Internet of Things and its services are becoming part of our everyday life, ways of working, and business, as part of Information and Communications Technology.

About AWS IoT

AWS IoT can connect billions of devices and send trillions of messages, and can process and route those messages to AWS endpoints and to other devices in a reliable and secure manner. With AWS IoT, your applications can communicate with all your devices, all the time. AWS IoT makes it easy to use AWS services such as DynamoDB, RDS, Lambda, Kinesis, S3, and Machine Learning to build IoT applications that gather, process, analyze, and act on data generated by connected devices completely in the cloud.
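As a small, hedged sketch of how an application might push a reading into AWS IoT with boto3's `iot-data` client: the topic scheme (`sensors/<device>/data`) and helper names are our own conventions, not part of AWS IoT itself:

```python
import json

def sensor_message(device_id, readings):
    """Assemble the MQTT topic and JSON payload for one sensor reading."""
    return {
        "topic": "sensors/{}/data".format(device_id),
        "qos": 1,  # at-least-once delivery
        "payload": json.dumps(readings),
    }

def publish_reading(iot_data_client, device_id, readings):
    """Publish via the AWS IoT message broker; client is boto3.client('iot-data')."""
    msg = sensor_message(device_id, readings)
    return iot_data_client.publish(topic=msg["topic"], qos=msg["qos"],
                                   payload=msg["payload"])
```

A rules-engine rule matching `sensors/+/data` could then route these messages into DynamoDB, Lambda, or Kinesis.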

AWS IOT Architecture



AWS IOT Hardware Device Intel® Edison and Grove IoT Starter Kit Powered by AWS


The bundle includes the Grove IoT Environmental Kit* from Seeed Studios, a rapid-prototyping kit for designing indoor applications based on the Intel® Edison development board, and Amazon Web Services* (AWS), a suite of services that enables secure, bidirectional communications between the device and the cloud. AWS IoT* is a platform that allows devices (cars, turbines, sensor grids, light bulbs, and more) to connect to AWS services so companies can store, process, analyze, and act on the volumes of data generated by connected devices on a global scale. With a base shield that can connect up to 11 different sensors and actuators and access to AWS, you can easily create a new Internet of Things (IoT) device to explore and interact with your indoor environment. AWS services extend the functionality of the Grove Indoor Environmental Kit* for Intel Edison, adding the ability to transform, augment, or route messages to the AWS cloud with secure authentication from X.509 certificates installed on your device. You can also control how your IoT clients such as microcontrollers, sensors, actuators, mobile devices, or applications connect to the AWS cloud with built-in services and SDKs to fine-tune communication, rules, and roles.

Parts List:


Board/Part Qty Documentation
Intel® Edison for Arduino 1 Read Here
Base Shield 1 Read Here
Grove - Temperature&Humidity Sensor (High-Accuracy & Mini) 1 Read Here
Grove - Moisture Sensor 1 Read Here
Grove - Light Sensor 1 Read Here
Grove - UV Sensor 1 Read Here
Grove - PIR Motion Sensor 1 Read Here
Grove - Encoder 1 Read Here
Grove - Button 1 Read Here
Grove - LCD RGB Backlight 1 Read Here
Grove - Relay 1 Read Here
Grove - Servo 1 Read Here
Grove - Buzzer 1 Read Here
USB Cable; 480mm-Black 1 -
USB Wall Power Supply 1 -

Project Scope of Work


We expect the team to connect all five sensors listed below using the AWS IoT architecture shown above. Once the device is connected to AWS, we should be able to capture the data in DynamoDB. The complete project should use the Device Gateway and Device Shadow with TLS authentication and the MQTT protocol. The sensors to be used are defined below.

Intel® device With AWS IoT Architecture



Intel® Edison for Arduino



FEATURES


  • Uses a 22nm Intel® SoC that includes a dual core, dual threaded Intel® Atom™ CPU at 500MHz and a 32-bit Intel® Quark™ microcontroller at 100 MHz. It supports 40 GPIOs and includes 1GB LPDDR3, 4 GB EMMC, and dual-band WiFi and BTLE on a module slightly larger than a postage stamp.
  • The Intel Edison module will initially support development with Arduino* and C/C++, followed by Node.js, Python, RTOS, and Visual Programming support in the near future.
  • It includes a device-to-device and device-to-cloud connectivity framework to enable cross-device communication and a cloud-based, multi-tenant, time-series analytics service.
  • Has an SD card connector, a micro USB or standard-size USB host Type-A connector (via mechanical switch), a micro USB device port, 6 analog inputs, 20 digital input/output pins, 1x UART, 1x I2C, a 1x ICSP 6-pin header (SPI), and a power jack with 7V-15V DC input.

Sensors



Grove - Temperature & Humidity Sensor (High-Accuracy & Mini)


This is a multifunctional sensor that gives you temperature and relative humidity information at the same time. It utilizes a TH02 sensor that can meet general-purpose measurement needs. It provides reliable readings when the environment humidity is between 0-80% RH and the temperature is between 0-70°C, covering most home and daily applications that don't involve extreme conditions.

Grove - Moisture Sensor




The Grove - Moisture Sensor can be used to detect the moisture of soil and judge whether there is dampness around the sensor. It can be used in gardens to automate the watering of plants by deciding when they need water. It is used very easily by just inserting the sensor into the soil and reading the output using an ADC.

Grove - Light Sensor



The Grove - Light sensor module uses a GL5528 photoresistor (light-dependent resistor) to detect the intensity of light in the environment. The resistance of the photoresistor decreases when the intensity of light increases. A dual op-amp chip (LM358) on board produces a voltage corresponding to the intensity of light (i.e., based on the resistance value). The output signal from this module will be HIGH in bright light and LOW in the dark. This module can be used to build a light-controlled switch, i.e., switch off lights during the day and switch them on at night.

Grove - UV Sensor



The Grove - UV Sensor is used for detecting the intensity of incident ultraviolet (UV) radiation. This form of electromagnetic radiation has shorter wavelengths than visible radiation. It is based on the GUVA-S12D sensor, which has a wide spectral range of 200nm-400nm. The module outputs an electrical signal that varies with the change in UV intensity. UV sensors are used for determining exposure to ultraviolet radiation in laboratory or environmental settings.

Grove - PIR Motion Sensor



This is a simple-to-use PIR motion sensor with a Grove-compatible interface. Simply connect it to the Stem shield and program it; when anyone moves within its detecting range, the sensor outputs HIGH on its SIG pin.
The detecting range and response speed can be adjusted by two potentiometers soldered on its circuit board. The response speed ranges from 0.3s to 25s, and the maximum detecting range is 6 meters.

AWS Device Shadow


A thing shadow (sometimes referred to as a device shadow) is a JSON document that is used to store and retrieve current state information for a thing (device, app, and so on). The Thing Shadows service maintains a thing shadow for each thing you connect to AWS IoT. You can use thing shadows to get and set the state of a thing over MQTT or HTTP, regardless of whether the thing is connected to the Internet.

Device Shadows Data Flow


The Thing Shadows service acts as an intermediary, allowing devices and applications to retrieve and update thing shadows.
The Thing Shadows service uses a number of MQTT topics to facilitate communication between applications and devices. To see how this works, use the AWS IoT MQTT client to subscribe to the following MQTT topics with QoS 1:
$aws/things/myLightBulb/shadow/update/accepted
The Thing Shadows service sends messages to this topic when an update is successfully made to a thing shadow.
$aws/things/myLightBulb/shadow/update/rejected
The Thing Shadows service sends messages to this topic when an update to a thing shadow is rejected.
$aws/things/myLightBulb/shadow/update/delta
The Thing Shadows service sends messages to this topic when a difference is detected between the reported and desired sections of a thing shadow.
$aws/things/myLightBulb/shadow/get/accepted
The Thing Shadows service sends messages to this topic when a request for a thing shadow is made successfully.
$aws/things/myLightBulb/shadow/get/rejected
The Thing Shadows service sends messages to this topic when a request for a thing shadow is rejected.
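Because the reserved shadow topics above all share a fixed structure, a client can build them from the thing name. A minimal sketch (the helper name and dictionary keys are our own) that reproduces the topics listed for `myLightBulb`:

```python
def shadow_topics(thing_name):
    """Build the reserved MQTT topics for one thing's shadow."""
    base = "$aws/things/{}/shadow".format(thing_name)
    return {
        "update": base + "/update",
        "update_accepted": base + "/update/accepted",
        "update_rejected": base + "/update/rejected",
        "update_delta": base + "/update/delta",
        "get": base + "/get",
        "get_accepted": base + "/get/accepted",
        "get_rejected": base + "/get/rejected",
    }
```

An MQTT client would subscribe to the `accepted`, `rejected`, and `delta` topics with QoS 1 and publish state changes to the `update` topic.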





AWS IoT Device Gateway


The AWS IoT Device Gateway enables devices to securely and efficiently communicate with AWS IoT. The Device Gateway can exchange messages using a publication/subscription model, which enables one-to-one and one-to-many communications. With this one-to-many communication pattern AWS IoT makes it possible for a connected device to broadcast data to multiple subscribers for a given topic. The Device Gateway supports MQTT, WebSockets, and HTTP 1.1 protocols and you can easily implement support for proprietary or legacy protocols. The Device Gateway scales automatically to support over a billion devices without provisioning infrastructure.

Amazon DynamoDB


Amazon DynamoDB is a fast and flexible NoSQL database service for all applications that need consistent, single-digit millisecond latency at any scale. It is a fully managed cloud database and supports both document and key-value store models. Its flexible data model and reliable performance make it a great fit for mobile, web, gaming, ad tech, IoT, and many other applications. Start today by downloading the local version of DynamoDB, then read our Getting Started Guide.
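Capturing a sensor reading in DynamoDB, as the project scope above requires, can be sketched with boto3. The table schema (`device_id` plus a `ts` timestamp) and helper names are our own assumptions; note that the Python SDK rejects `float`, so numeric readings are converted to `Decimal`:

```python
from decimal import Decimal

def sensor_item(device_id, timestamp, readings):
    """Convert raw sensor readings into a DynamoDB item.

    Numeric values are stored as Decimal, which is what the boto3
    DynamoDB resource expects for number attributes.
    """
    item = {"device_id": device_id, "ts": timestamp}
    for name, value in readings.items():
        item[name] = Decimal(str(value))
    return item

def store_reading(table, device_id, timestamp, readings):
    """Write one reading; `table` is a boto3 DynamoDB Table resource."""
    return table.put_item(Item=sensor_item(device_id, timestamp, readings))
```

With `table = boto3.resource("dynamodb").Table("SensorData")` (a hypothetical table name), each published reading could be written via `store_reading(table, "edison-01", 1475452800, {"temperature": 22.5})`.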

Reference Material


Interactive tutorial
AWS IoT is a managed cloud service that lets connected devices -- cars, light bulbs, sensor grids and more -- easily and securely interact with cloud applications and other devices.
This interactive tutorial will help you to get started quickly by demonstrating the following service features:
  • Connect things to the Device Gateway
  • Process and act on data with the Rules Engine
  • Read and set device state with Device Shadows



IoT news and features


The Internet of Things is an evolving term used to describe objects and their virtual representations communicating via an internet-like network. The concept has been discussed since 1991, when the initial idea was based on control networks that would allow for the remote control and monitoring of devices, inventory, and factory functions. Today, the term Internet of Things relates to the advanced connectivity of devices, systems, and services, going beyond merely machine-to-machine (M2M) communication. It is estimated that by 2020 there will be over 25 billion devices wirelessly connected to the Internet of Things, including embedded and wearable computing devices. We track this emerging phenomenon.

Application Scope


IoT is used in a wide range of applications depending on the cloud network mode, level, and WAN coverage area, and these days many companies are moving to AWS IoT. GE CEO Jeff Immelt said that a global network connecting people, data, and machines, called the Industrial Internet, has the potential to add $10 to $15 trillion to global GDP in the next 20 years. GE plans to invest $1 billion in the "development of industrial internet technology and applications to make customers more productive." The IoT concept covers fully automated monitoring and reporting of utilities, plants, and animals over the internet. The top-ranking IoT application areas are the smart home, smart city, smart grids, the Industrial Internet, and connected health (digital health/telehealth/telemedicine).