Tuesday, September 2, 2014

What is Systems Integration?

Often, individually tested software components need to be combined, or integrated, into a whole. This integration may involve combining components into subsystems, or combining subsystems into complete products. The overall process is called Systems Integration.

Why should systems be integrated?
If systems integration did not offer distinct advantages, there would be no need to do it. So, what advantages does systems integration bring? Think of it this way: a product is built from several components before being assembled into a finished whole. These “builds,” or units, can be joined incrementally, in a vertical manner. The units may belong to a single subsystem, or may cut across subsystem boundaries to produce a partial end-to-end product. In either case, integration should be done in set stages, so that each incremental build delivers results at every phase as it moves closer to the end product.
In addition, keeping integration-ready units on hand gives developers a real, rather than simulated, environment for integration work, and lets them cut costs significantly by identifying problems at every stage and making changes as required. Phased, or staged, integration delivers exactly this advantage: when even complex problems are identified and addressed during the early stages of integration, the result is a better and more compliant product.
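To make the idea of staged integration concrete, here is a minimal sketch in Python (the component names and checks are hypothetical, purely for illustration): two individually tested units are combined into a first incremental build, and an integration check is run at that stage before further units are added.

```python
# Minimal sketch of phased (incremental) integration, with hypothetical components.
# Each unit is assumed to have passed its own unit tests before being integrated.

def parse_order(raw: str) -> dict:
    """Unit 1: parse a raw order line such as 'widget,3'."""
    name, qty = raw.split(",")
    return {"item": name.strip(), "quantity": int(qty)}

def price_order(order: dict, price_list: dict) -> float:
    """Unit 2: price a parsed order against a price list."""
    return price_list[order["item"]] * order["quantity"]

def integration_stage_1():
    """First incremental build: parser + pricer combined and checked together."""
    order = parse_order("widget, 3")
    total = price_order(order, {"widget": 2.50})
    assert total == 7.50, "integration defect caught at stage 1"

if __name__ == "__main__":
    integration_stage_1()
    print("Stage 1 integration build passed")
```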


An understanding of project management

Well, project management is something we all put into practice at various times in our lives, although in different situations and circumstances. If we wanted to plan a party, what would we do? We would organize every segment of the event. We would call the suppliers, invite the guests, and then host the party, right? So, how does all this relate to project management? In other words, what is project management?
In order to understand what project management is, we need to understand what a project is. As we know, a project is an activity or set of activities undertaken to complete a given piece of work that leads to a defined result. We saw the example of organizing the party. Project management is not about simply rushing into doing things; the goal we plan to achieve should be approached in an orderly and structured manner. This makes accomplishing the goal easy and effective.

Planning is the key
Similarly, project management involves planning towards reaching the target. We first have to understand what the work (the project) is about. Then, we need to go about achieving it by using resources in the optimal manner. Resources, be they physical, human or economic, need to be put to the best use to minimize cost and maximize effectiveness.

Why is project management important?
In the face of competition, organizations need to economize if they are to increase their margins and profits. Getting their project management right goes a long way in helping them do this. Properly planned project management is essential for organizations to understand how effectively, in cost and other terms, they are performing in order to stay the course. Initiation, planning, execution, monitoring and control, and closing are all important elements of project management. Getting these right means that organizations can improve their bottom line.

An understanding of Enterprise Architecture

An easy way of understanding Enterprise Architecture is to first understand the term “architecture” in its general sense. It means the style or character of a building, right? Relate this to an enterprise, and you have a definition of Enterprise Architecture. An enterprise, as a technology-oriented business is generally called, also has its architecture, and that architecture defines the character of the enterprise.
Just as a building has features that constitute its architecture, an enterprise has its own unique ones. A building's architecture is the sum of its elements, and similarly an enterprise has its architecture, or what may be called its constituents. So, what are these? Broadly, they are Business Architecture, Information Architecture and IT Architecture. Having said this, let us move on to an understanding of the uses of Enterprise Architecture.

Getting strategies right
The most important reason Enterprise Architecture evolved is the observation that more than four fifths of organizations' strategies failed, not because the strategies themselves were ineffective, but because of how they were implemented. Over the last couple of decades or so, Enterprise Architecture has evolved out of an understanding of this fact and the need to address it.
Professionals put together a concrete and well-defined practice for analyzing, designing, planning and implementing a set of ideas to fully understand and execute strategies. This involves identifying the information, processes and technological aspects of an enterprise and implementing solutions using a systematic approach. The entire purpose is to take the enterprise to higher levels of performance with minimal hitches. This approach helps enterprises and organizations of various kinds come up with strategies that minimize loss.

The various Enterprise Architecture programs and certifications
Since Enterprise Architecture has evolved highly over the past few years, it has grown into a full-fledged profession with its own certifications, programs and forums. At present, these are the most popular certifications and programs for Enterprise Architecture, used in a number of organizations worldwide:
  • TOGAF®
  • ArchiMate®
  • The Open Group Exploration, Mining, Metals & Minerals vertical (EMMM) Forum
  • The Open Certified Architect (Open CA) program (formerly ITAC)
  • The Open Group Business Forum

References:
http://feapo.org/wp-content/uploads/2013/11/Common-Perspectives-on-Enterprise-Architecture-v15.pdf

Database management is crucial for organizations

In the simplest sense, as the term suggests, database management is the management of data. It became a major discipline mainly because of the humongous amount of data that needs to be managed. Organizations typically have not just huge amounts of data; their data is diffuse and complex, too. This is why database management has evolved into a full-fledged field.
Database management can be considered “…the monitoring, administration, and maintenance of the databases and database groups in (an) enterprise”. From this, and given the huge volumes of data that organizations typically handle, it is clear that a database management system: a) manages very large volumes of data; b) efficiently supports access to that data; and c) supports multiple sets of data, understanding their relationships with each other, without causing mayhem.

Important elements of database management
Database management is of critical importance to organizations, since sensitive data is stored in their databases. There are two aspects that database management needs to look into: a) monitoring the databases; b) assessing their strength and performance. A brief look at each of these:

Monitoring: A database monitoring system has to be comprehensive, because it has to monitor data across the entire organization. The aim of a monitoring system is to help the organization identify the problems in its database environment that are hindering performance. In Oracle databases, for example, monitoring tools use data from the Automatic Workload Repository (AWR) to display performance information and generate database alerts.

Assessing the performance: Assessment builds on monitoring; a comprehensive diagnosis, which is what monitoring provides, is the basis for assessment. Oracle databases, for instance, provide the Automatic Database Diagnostic Monitor (ADDM), which analyzes the AWR snapshots of database activity taken over a set period and uses them to provide recommendations for better usage.
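As a small, hedged sketch of how such monitoring data might be pulled programmatically, the following assumes an Oracle database, the python-oracledb driver, and a user licensed and privileged to read the AWR view DBA_HIST_SNAPSHOT; the connection details are placeholders, not part of the article above.

```python
# Sketch: listing recent AWR snapshots from Oracle's workload repository.
# Assumes the python-oracledb driver and read access to DBA_HIST_SNAPSHOT;
# user/password/dsn are placeholders, not real credentials.
import oracledb

def recent_snapshots(limit: int = 10):
    conn = oracledb.connect(user="monitor", password="secret", dsn="dbhost/orclpdb")
    try:
        cur = conn.cursor()
        cur.execute(
            """
            SELECT snap_id, begin_interval_time, end_interval_time
              FROM dba_hist_snapshot
             ORDER BY snap_id DESC
             FETCH FIRST :n ROWS ONLY
            """,
            n=limit,
        )
        return cur.fetchall()
    finally:
        conn.close()

if __name__ == "__main__":
    for snap_id, begin, end in recent_snapshots():
        print(snap_id, begin, end)
```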

References:
http://www.webopedia.com/TERM/D/database_management_system_DBMS.html

Understanding product line architectures

A product line architecture (PLA) can be defined as a set, family, or group of elements that coordinate with each other to provide a defined product functionality. It defines not only the element types, but also how they interact and how they map to the product functionality.
Additionally, a product line architecture may also define particular instances of the architectural elements. We can use the term to refer to a set of related products within an organization.

How do we define PLA?
We can think of a product line architecture (PLA) as a blueprint for creating groups or families of related applications. Product line architecture proceeds on the premise that it is wiser for organizations to produce sets of closely related products than to build individual products one by one.

Why should organizations go for PLA?
Product line architectures matter for a very practical reason: software becomes obsolete, and grows in complexity, at breakneck speed, which calls for a strong effort to keep the costs of software development and maintenance down. With a product line architecture, an organization can reuse the effort it puts into software design and development across a variety of products, leading to a major reduction in costs.
It is for this simple, commonsensical reason that creating families of products is more prudent, and saves more cost and time, than creating individual ones. The principle correlates with economies of scale: just as it makes more sense to build one mold that manufactures many products than to have one mold for each, a PLA creates standards or frameworks on which smaller and more diverse products can be built with less effort and time.
Organizations have been experiencing the benefits of software product families for about four decades.
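A minimal sketch (Python, with invented names) of the mold analogy: a shared core asset is built once for the whole product line, and each product in the family adds only its own variant features on top of it.

```python
# Sketch: one shared core reused across a family of products (hypothetical names).

class ReportCore:
    """Shared asset: built and maintained once for the whole product line."""
    def render(self, data: dict) -> str:
        return "\n".join(f"{k}: {v}" for k, v in data.items())

class BasicReportProduct:
    """Product A: just the shared core."""
    def __init__(self):
        self.core = ReportCore()
    def report(self, data):
        return self.core.render(data)

class AuditedReportProduct:
    """Product B: shared core plus a variant feature (an audit footer)."""
    def __init__(self):
        self.core = ReportCore()
    def report(self, data):
        return self.core.render(data) + "\n-- audited --"

if __name__ == "__main__":
    print(BasicReportProduct().report({"total": 42}))
    print(AuditedReportProduct().report({"total": 42}))
```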

References:
http://www.cs.utexas.edu/ftp/predator/stja.pdf

What is computer architecture?

Computer architecture can be thought of as the protocol by which the various technologies that go into a computer are designed to interact with each other. It can be described as a specification detailing the way the software and hardware technology standards in a computer interact to form a computer system or platform. Put more simply, it refers to the way a computer system is designed and the technologies it can work with.

Why “architecture”?
Why do we have the word “architecture” in this term? It is used to denote the method by which the needs and requirements of all the elements related to computers, such as the user, the system, and the technology, work with each other. The architecture comprises the standards and designs that are based on those needs and requirements.
Computer architecture can be said to have originated with the Von Neumann architecture. Dating to the end of World War II and created by the mathematician John von Neumann, it is still valid and is used by almost all kinds of computers to this day. It describes the core design of an electronic computer and its parts, and the way they interact with each other, with the CPU, which carries out the main functions of the computer, at the heart of the system.

The three categories of computer architecture
Computer architecture consists of three categories:
System Design: System design covers all the hardware components of the system. These include the CPU, memory controllers, the graphics processing unit, data paths, and other items such as virtualization and multiprocessing support, apart, of course, from the data processors.

Instruction Set Architecture (ISA): The ISA is the CPU's embedded programming language. It gives a clear definition of the CPU's capabilities and functions, based on the programming it is capable of performing or processing. The important elements of an ISA are the processor register types, memory addressing modes, data formats, word size, and the instruction set used by programmers.
Microarchitecture: This is another word for computer organization. It defines data processing, data paths, and storage elements, in addition to how they implement the ISA.
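To make the ISA idea concrete, here is a toy sketch in Python of an invented three-instruction set (not any real processor's ISA): the instruction set defines what the machine can do, while the loop that executes it plays, very loosely, the role of the microarchitecture.

```python
# Toy illustration of an instruction set: three invented instructions
# (LOAD, ADD, PRINT) operating on named registers. Not a real ISA.

def run(program):
    registers = {"r0": 0, "r1": 0, "r2": 0}   # tiny register file
    for op, *args in program:
        if op == "LOAD":                      # LOAD reg, constant
            reg, value = args
            registers[reg] = value
        elif op == "ADD":                     # ADD dst, src1, src2
            dst, a, b = args
            registers[dst] = registers[a] + registers[b]
        elif op == "PRINT":                   # PRINT reg
            print(registers[args[0]])
        else:
            raise ValueError(f"unknown instruction: {op}")

run([("LOAD", "r0", 2), ("LOAD", "r1", 3), ("ADD", "r2", "r0", "r1"), ("PRINT", "r2")])
```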

References:
http://www.eitaglobal.com/control/w_product/~product_id=300140REC
http://www.eitaglobal.com/control/w_product/~product_id=300142LIVE
http://www.techopedia.com/definition/26757/computer-architecture

Sunday, March 23, 2014

Data Center Design - Webinar By EITAGlobal

Overview: Data centers seldom meet the operational and capacity requirements of their initial designs. The principal goals in data center design are flexibility and scalability, which involve site location, building selection, floor layout, electrical system design, mechanical design and modularity. Creating a sound data center design is one of the most critical steps to assure long term goals for sustainability, flexibility and power savings. I will share my knowledge and expertise, gained from working on data center projects around the world, to support a better understanding of the essential subjects for all data center professionals. 

Why should you attend: Selecting the location for and designing a data center involves over 100 crucial elements and decisions that need to be made to ensure successful long-term operation. Even if you are thinking of retrofitting an existing data center, these technical best practices will reduce the likelihood of a catastrophic infrastructure outage that could put your company out of business.

Areas Covered in the Session:

  • Is a data center the right option?
  • Location
  • Planning
  • Facilities
  • Infrastructure design
  • Energy efficiency
  • Operations

Who Will Benefit:

  • CIO
  • CFO
  • CSO
  • Data center Managers
  • Data center Designers
  • Data center Operators
Speaker Profile:
Craig Borysowich has over 25 years of technology consulting experience with both public- and private-sector clients, including over ten years in technical leadership roles. Craig has an extensive background in large-scale, high-profile systems integration and development projects that span a customer's entire organization, and in designing robust solutions that bring together multiple platforms, from Intel to Unix to mainframe technologies, with the Internet.

Business-Object Based Applications - Webinar By Brad Friedlander


Overview: This webinar will help participants understand the potential value of using business objects, and improve the way they design and architect applications that use them.

Why should you attend: It is "well-known" that applications should be created based on business objects. What is not well known is what constitutes a good business object and how you should approach its creation and usage.
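As a small, hypothetical sketch (Python, with an invented Invoice example not drawn from the webinar itself), a business object typically couples business data with the rules that govern it, rather than exposing bare fields for callers to manipulate.

```python
# Hypothetical sketch: a business object couples business data with its rules,
# instead of exposing bare fields for callers to manipulate.
from dataclasses import dataclass, field

@dataclass
class Invoice:
    customer_id: str
    lines: list = field(default_factory=list)
    paid: bool = False

    def add_line(self, description: str, amount: float) -> None:
        if amount <= 0:
            raise ValueError("line amount must be positive")   # business rule lives here
        self.lines.append((description, amount))

    def total(self) -> float:
        return sum(amount for _, amount in self.lines)

    def settle(self, payment: float) -> None:
        if payment < self.total():
            raise ValueError("payment does not cover the invoice")
        self.paid = True

inv = Invoice(customer_id="C042")
inv.add_line("Consulting", 1200.0)
inv.settle(1200.0)
print(inv.paid, inv.total())
```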

Areas Covered in the Session:
  • Why use business objects?
  • What are business objects?
  • What is the promise of business objects?
  • What constitutes a good business object design?
  • What is a good business object architecture?

Who Will Benefit:
  • Architects
  • Application Designers
  • Developers
  • Business Analysts
Click here to know more about: Business-Object Based Applications

When and How to Build Private and Hybrid Clouds - Webinar By EITAGlobal

Overview: Many organizations have upward of 75% of their production servers virtualized. The results are easy to see: fewer, better utilized servers and lower data center costs. The next logical evolution for these companies is the private cloud. Private IaaS clouds are highly standardized, automated, virtual pools of compute, storage, and network resources. The virtual resources could be deployed via self-service portals by developers, shared across business units, and metered for pay-per-use chargeback. 

This presentation will show what the requirements are and how to approach a private cloud implementation. We will discuss the essential components of a private cloud: self-service that lets authorized users select from a number of deployment options, automated provisioning, resource management to control demand and supply, and accounting for service consumption. The webinar will then discuss another model that is starting to gain some traction: the hybrid cloud, where a private and a public cloud are combined to meet different application requirements. We will highlight the typical use cases of hybrid clouds and some of their inherent challenges. The presentation closes with a high-level cookbook for selecting the cloud deployment model that fits your organization.
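As a very rough, hypothetical sketch (Python, not an actual cloud management product) of two of those components, here is what a standardized self-service catalog and pay-per-use metering might look like in miniature:

```python
# Rough sketch (hypothetical) of two private-cloud building blocks:
# a standardized self-service catalog and pay-per-use metering.
from dataclasses import dataclass

CATALOG = {                      # standardized deployment options
    "small":  {"vcpus": 2, "ram_gb": 4,  "rate_per_hour": 0.05},
    "medium": {"vcpus": 4, "ram_gb": 16, "rate_per_hour": 0.20},
}

@dataclass
class Provisioned:
    owner: str
    size: str
    hours_used: float = 0.0

    def chargeback(self) -> float:
        """Metered cost for pay-per-use chargeback to the business unit."""
        return self.hours_used * CATALOG[self.size]["rate_per_hour"]

def self_service_request(owner: str, size: str) -> Provisioned:
    if size not in CATALOG:
        raise ValueError(f"{size!r} is not a standardized option")
    return Provisioned(owner=owner, size=size)

vm = self_service_request("marketing", "medium")
vm.hours_used = 120
print(f"Chargeback for {vm.owner}: ${vm.chargeback():.2f}")
```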

Why Should you Attend: Private cloud is a big-ticket item that is on the radar for most Fortune 2000 companies. Its promise is appealing: get the best of both worlds, the flexibility of a public cloud and control over your own infrastructure. However, Forrester found that most enterprises haven't matured their cloud management practices to the point where they could fully exploit a private cloud.

There are also significant risk factors: the implementation costs are often underestimated (we already have the hardware, right?), as is the management complexity of a private cloud, and the in-house staff often does not have the skills to successfully operate the cloud.

While the hybrid cloud sounds good in all the vendor literature, in practice there can be serious roadblocks for an efficient implementation: what latency can you tolerate in a distributed workload, how do you maintain the integrity of a system of record when data gets replicated, to name a few. All these questions require a careful evaluation to determine the best approach for a particular IT organization and application requirements: public, private, hybrid cloud, or no cloud?

Areas Covered in the Session:

  • Private cloud: drivers & challenges
  • Defining the requirements
  • What are the top products to build a private cloud?
  • Use cases and limitations of hybrid clouds
  • How to select a cloud deployment model

Who Will Benefit:

  • Architect
  • Enterprise Architect
  • Development Manager
  • IT Manager
  • Director of Technology
  • Chief Technology Officer
  • Consultant

Contact: 
James Richard
Phone: +1-800-447-9407
Email: webinars@eitaglobal.com / support@eitaglobal.com

Big Data Roadmap for the Relational Database Professional - Webinar By EITAGlobal

Overview: Big Data is an industry meme that is gaining traction and cannot be ignored if you wish to continue pursuing a data management career. But what is Big Data? Does it differ greatly from Oracle, SQL Server, DB2, and other relational database systems? And if so, how? This session will provide a roadmap to Big Data terminology, use cases, and technology. Attend this session to wade through the hype and start your journey toward discovering what Big Data is, and what it can do for you and your company. 

Why should you attend: Big Data and analytics is the current trend in database management and data processing. The requirements, methods, and practices for managing big data differ from traditional database management best practices, and today's professional can be left in the dust without an understanding of, and appreciation for, the new techniques, methods, and systems being deployed for big data analytics. Although the term Big Data seems straightforward enough, it means more than just having a lot of data. You need to know how it contrasts with traditional DBMSs. You need to know the use cases for Big Data applications. You need to know the new technology being used to store and manage the data… and to glean insight from the data. And you need to understand how to differentiate the database administration and development practices for relational databases from the NoSQL, Big Data systems that are increasingly being used for Big Data implementations. Failure to keep up can render your skills obsolete in the new age of Big Data.
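As a tiny illustrative sketch (Python, with in-memory stand-ins rather than any particular product), here is roughly how the same customer record looks in relational, key/value, and document form:

```python
# Illustrative only: in-memory stand-ins for relational vs. key/value vs. document
# representations of the same customer record (no particular product implied).

# Relational: fixed columns, one value per column, joins for related data.
relational_row = ("C042", "Ada Lopez", "ada@example.com")

# Key/value: an opaque value looked up by key; the store does not interpret it.
key_value_store = {"customer:C042": '{"name": "Ada Lopez", "email": "ada@example.com"}'}

# Document: nested, schema-flexible structure queried by its fields.
document = {
    "_id": "C042",
    "name": "Ada Lopez",
    "email": "ada@example.com",
    "orders": [{"sku": "W-1", "qty": 3}, {"sku": "W-9", "qty": 1}],  # embedded, no join
}

print(relational_row[1])
print(key_value_store["customer:C042"])
print(document["orders"][0]["sku"])
```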

Areas Covered in the Session:

  • Gain a working knowledge and definition of Big Data (beyond the simple three V's definition)
  • Break down and understand the often confusing terminology within the realm of Big Data (e.g. polyglot persistence)
  • Examine the four predominant NoSQL database systems used in Big Data implementations (graph, key/value, column, and document)
  • Learn some of the major differences between Big Data/NoSQL implementations vis-a-vis traditional transaction processing
  • Discover the primary use cases for Big Data and NoSQL versus relational databases

Who Will Benefit:

  • DBA
  • DBA Manager
  • Programmer/analyst
  • IT Architect
  • Data analyst
Speaker Profile: Craig Mullins is president and principal consultant of Mullins Consulting, Inc., a principal with SoftwareOnZ (a mainframe software distributor), and the publisher/editor for TheDatabaseSite.com.

Craig has over three decades of experience in all facets of database systems development and has worked with DB2 since V1. You may know Craig from his popular books: "DB2 Developer's Guide, 6th edition," which contains more than 1500 pages of in-depth technical information on DB2 for z/OS and "Database Administration: The Complete Guide to DBA Practices and Procedures, 2nd edition," the industry's only comprehensive guide to heterogeneous database administration.

Tuesday, March 11, 2014

6-Hour Virtual Seminar on Creating Hybrid Public - Private Cloud Applications With .NET and Windows Azure Service Bus - Webinar By EITAGlobal

Why should you attend: Public-cloud solutions have demonstrated an ability to cost-effectively provide scalable solutions to users of an application without the need for extensive capital investment in hardware. Unfortunately, the code and data written for many business applications are often not easily adaptable to the cloud, and it may be the case that moving them is not possible at all because of compliance issues or laws preventing sensitive data from being stored outside the corporate firewall. This leaves existing service and data assets that would be beneficial to cloud applications stranded behind the corporate firewall, where it is difficult, if not impossible, for them to participate in cloud-based services. This webinar will demonstrate how you can use the Windows Azure Service Bus, and specifically Service Bus Relay services, to easily and securely make data and services behind the corporate firewall available to public cloud and mobile applications.
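The seminar itself uses .NET, but as a rough flavor of basic Service Bus queue messaging, here is a hedged sketch using the azure-servicebus Python package; the connection string and queue name are placeholders, and the relay scenario described above involves additional .NET-side components not shown here.

```python
# Rough flavor of Service Bus queue messaging from Python (the seminar uses .NET).
# Assumes the azure-servicebus package; the connection string and queue name
# below are placeholders.
from azure.servicebus import ServiceBusClient, ServiceBusMessage

CONN_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."
QUEUE_NAME = "orders"

def send_order(payload: str) -> None:
    with ServiceBusClient.from_connection_string(CONN_STR) as client:
        with client.get_queue_sender(queue_name=QUEUE_NAME) as sender:
            sender.send_messages(ServiceBusMessage(payload))

if __name__ == "__main__":
    send_order('{"order_id": 42, "item": "widget"}')
```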

Areas Covered in the Session:
  • Introduction to Service Bus and hybrid cloud applications
  • Overview and installation of tools for service bus development
  • Messaging with the Service Bus: Queues, Topics and Relays
  • Creating and managing service bus accounts with the management portal
  • Creating a .NET application using Service Bus Queues
  • Creating a hybrid cloud application with the Service Bus Relay
  • Using Service Bus to Enable the Internet of Things
  • Mobile connectivity with iOS, Android, HTML and Windows Phone
  • Securing access to service bus end points
  • Service bus pricing models
  • Review and Q&A

Who Will Benefit:
  • CTO
  • CIO
  • IT VP and Development Managers
  • Application / Software Architects
  • Application / Software Developers and Engineers
Speaker Profile:

The speaker has almost 30 years of professional software development experience, focused on Microsoft-based technologies across multiple verticals including media, finance, energy and healthcare. He holds a master's degree in Mathematics and Computer Science from Drexel University, and a Master's of Technology Management from the University of Pennsylvania. He is currently focused on creating applications that use high concurrency, cloud services, messaging, computer vision and natural user interfaces to provide seamless application access as users move through different environments and geographies. He is a frequent speaker at .NET user groups and conferences, an author of technology papers and books, and former adjunct faculty for the University of Denver and the University of Phoenix, where he taught Computer Information Systems and Telecommunications Technology courses.

Virtual Seminar on ECM Integration with Microsoft PowerShell By EITAGlobal

Instructor: Dave Kinchlea
Product Id: 300108
Overview: ECM suites often provide the ability to directly address and solve many business problems. However, there are many more problems that cannot be solved directly by an ECM suite and instead require some third-party solution.

When native or reasonably inexpensive integrations between these solutions and the ECM repositories where the content should reside do not exist, then there are only a few options available:
  • Do nothing and leave the content unmanaged, unknown, and at risk.
  • Create bespoke integration solutions writing whatever modules, functions or code is required to accomplish the required integration. This option is almost always the most expensive, usually exceeding any existing 3rd party solution. It means long-term development, support, maintenance, and training. While this approach can lead to a superior solution, it is one that will almost always take longer than expected and be over budget.
  • Work locally and force the users to change the way they work, manually storing and retrieving the content from the ECM system using other tools such as WebDAV or Windows Explorer. For some problems this is the ideal solution, when there are only a few users requiring the integration. However, there is ongoing training required to ensure the integration holds up over the long term, and there is a real risk that the ECM system may not get used 100% of the time.
  • Work locally and use scheduled tasks to synchronize content between local storage and ECM solutions. This loose integration approach requires almost zero training and limited development and long-term support, and is easily maintained over time. The difficulty is in creating the synchronization solution; it too can often turn into a full-blown bespoke application (a brief sketch of this approach follows the list).
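A minimal sketch of that last, loose-integration option (shown in Python for illustration, whereas the seminar itself uses PowerShell; upload_to_ecm and the folder path are hypothetical stand-ins): a scheduled task sweeps a local folder and pushes any new files into the ECM repository.

```python
# Sketch of the loose-integration option: a scheduled task sweeps a local
# folder and pushes new files into the ECM repository. upload_to_ecm() and the
# folder path are hypothetical stand-ins for the actual repository API.
import pathlib

WATCHED = pathlib.Path("C:/Users/shared/to_archive")
SYNCED_LOG = WATCHED / ".synced"

def upload_to_ecm(path: pathlib.Path) -> None:
    # Placeholder: call the ECM vendor's API or command-line tool here.
    print(f"uploading {path.name} to the ECM repository")

def sync_once() -> None:
    already = set(SYNCED_LOG.read_text().splitlines()) if SYNCED_LOG.exists() else set()
    for path in WATCHED.glob("*.*"):
        if path != SYNCED_LOG and path.name not in already:
            upload_to_ecm(path)
            already.add(path.name)
    SYNCED_LOG.write_text("\n".join(sorted(already)))

if __name__ == "__main__":
    sync_once()   # run via the OS task scheduler, e.g. hourly
```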

Areas Covered in the Session:
  • Problem space - a discussion as to where CLI and scripting fit within an enterprise
  • PowerShell description - a look into relevant portions of PowerShell
  • ECM Integration - the meaning of; the choices available along with the tradeoffs associated with each type of integration
  • MS Integration - the meaning of; some examples of
  • File System Archiving / Synchronization example Integration Solution
  • Workflow Word Robot example Integration Solution
  • ECM content migration (ETL) example Integration Solution - SharePoint to Content Server
  • Automation / Task scheduling
  • PowerShell / ECM Error Handling, Pitfalls and Gotchas

Who Will Benefit:
  • Enterprise Solution Architects
  • Application Integrators
  • System Administrators of all levels
  • Application & desktop support (site administrators)
Speaker Profile: 
Dave Kinchlea

Effective Portfolio Management - Webinar By EITAGlobal

Instructor: Fabienne Fayad
Product Id: 300084
Overview: This webinar is about Effective Portfolio Management. The objective is to help ensure that you prioritize and select the right projects for your strategic objectives, so you can manage the portfolio with the agility and flexibility needed to adapt to "real life" while delivering the required results. Are you funding the right projects? Is your staff working on the right tasks? Are you getting maximum value from the investments you have? Effective Portfolio Management is about getting the highest value from your investments while delivering the best value in accordance with the vision, principles and organizational priorities.

What is portfolio management?
Portfolio Management is the art and science of making decisions about investment mix and policy, matching investments to objectives, allocating assets for individuals and institutions, and balancing risk against performance. Portfolio management is all about strengths, weaknesses, opportunities and threats in the choice and management of projects. The end objective is to maximize return on investment (qualitative and quantitative) at a given level of risk. An IT project portfolio is a collection of IT projects, both proposed and in progress, which includes some deployed systems (already in production) within your organization. Effective Portfolio Management addresses the decision process in order to show how it is possible to minimize risk and focus on the projects that address the right priorities of the organization. Your IT portfolio should be a diversified mix of high-risk/high-reward and low-risk/low-reward elements. Just like a personal investment portfolio, you should look at it in terms of return on investment and whether it addresses the right objectives.

Portfolio creation:
  • How and where to start?
  • How to do yearly planning?
  • How to select projects according to vision and business priorities?
  • Why should you develop project filtering and ranking criteria? (A toy scoring sketch follows this list.)
  • How to ensure that projects will address real business needs and priorities?
  • How to prepare requirements in order to get project proposals?
  • How to request an internal project proposal?
  • How to work with project managers on preparing a Business Case to present the project in a proposal?
  • How to evaluate the financials?
  • How to evaluate the project proposed and provide an independent opinion?
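A toy sketch (Python, with invented weights and projects) of the filtering-and-ranking idea mentioned in the list above: score each proposal on strategic alignment, expected return, and risk, then rank the proposals by score.

```python
# Toy illustration of project filtering and ranking (hypothetical weights/criteria).
WEIGHTS = {"alignment": 0.4, "expected_return": 0.4, "risk": -0.2}  # risk lowers the score

proposals = [
    {"name": "CRM upgrade",    "alignment": 9, "expected_return": 6, "risk": 3},
    {"name": "Data warehouse", "alignment": 7, "expected_return": 8, "risk": 6},
    {"name": "Legacy rewrite", "alignment": 4, "expected_return": 5, "risk": 8},
]

def score(p):
    return sum(WEIGHTS[c] * p[c] for c in WEIGHTS)

for p in sorted(proposals, key=score, reverse=True):
    print(f"{p['name']:15s} score={score(p):.1f}")
```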

Portfolio Management:
  • How to continuously monitor the results and take the appropriate action.
  • How to manage change without impacting the project.
  • How to adapt the portfolio to the needs without changing all the time.
  • How to put in place the right level of management required for a specific project size and importance.

How to avoid "unhealthy" portfolios like these:
  • Projects undertaken are not aligned with the organization strategic direction.
  • Decisions are based on uncertain, evolving, conflicting and incomplete information without having first built contingency or analyzed the risks.
  • The project portfolio is generally developed on uncertain and changing information; however, there is a way to improve the decision making and to plan for success.

Why should you attend: You should attend this webinar if you are a manager, a project manager or an internal IT client; or if you are currently managing a project, or the money invested in a project, and would like to improve your portfolio decision process and the return made on investments (the time, money and resources invested in projects).

Areas Covered in the Session:
  • Why do organizations need Portfolio Management?
  • How to improve your Portfolio Management?
  • How to create a portfolio-minded culture?
  • How to select and implement appropriate Tools and Practices?
  • Overview of the process of:
    • Yearly Planning
    • Portfolio monitoring and adjustment
  • Some examples and best practices

Who Will Benefit:
  • Business Manager
  • Project Managers
  • IT Directors
  • Project Sponsors
  • IT Executives
Speaker Profile: 
Fabienne Fayad has over 25 years of experience in the field of IT, including 18 years in management consulting and project management. She holds degrees in both engineering and executive management. Over the years, she has gained experience at many international consulting firms: CGI, Deloitte, LGS/IBM, and Systemhouse. Today, she is a partner at EDGN CONSULTANTS INC., offering strategic consulting services and training.