Dell Case Study: Circular Economy and Closed-Loop Ecosystems

The case study discusses Dell’s use of the circular economy and closed-loop ecosystems to dispose of e-waste responsibly. One of Dell’s main strategies for raising productivity for itself and its customers is the development of a sustainable supply chain. For starters, electronic waste (e-waste) arises when electronic products or devices reach the end of their “useful life.” Many electronic items found in the workplace or at home, such as computers, laptops, printers, and cell phones, can be recycled (Mayer Metals, 2018).

Due to the tremendous growth in demand for electrical and electronic equipment and its disposal after use, e-waste is becoming a global issue. Disposed e-waste can endanger health and the environment if not handled properly (Department of Environment Malaysia, 2018). Dell has taken constructive action by embracing a total lifecycle approach to improve its plastic production, use, and disposal.

In general, the company’s product design emphasizes ease of repair and recycling from the outset.

Dell also actively finds ways to incorporate green materials, such as recycled plastic, into goods and packaging as a way to meet its sustainable supply chain objectives. The closed-loop program is a transition away from “take, make, dispose”: it now uses recycled plastics to produce almost 5,000 tons of new parts for over 90 items across millions of units. It takes just under six months to complete the cycle from the moment the equipment is collected for reuse to the time the plastics are back in the hands of a customer as part of a new product.

It can be said that Dell’s sustainable supply chain programs and initiatives paid off: the company reduced its dependence on virgin materials, which are costly due to fluctuating fossil fuel prices, and cut its carbon footprint by 11% compared with using virgin materials.

Dell faces challenges in making its closed-loop supply chain reliable. Dell constantly needs a vast volume of products in its lineup to be able to use the recycled plastics. On the other hand, Dell recovers less plastic per item because, through innovation and advances in technology, electronic devices are becoming smaller, so Dell needs to accept or buy a larger volume of recycled goods to produce its products. To address this issue, Dell will need to increase participation in its takeback program globally and will tend to incur additional cost to meet the additional volume of recyclable products required.

Identifying the types of plastics is one of the main challenges Dell faced: only the most suitable ones are picked to develop its products, since recyclable items are made from many different raw materials. Dell stated that it needed engineering know-how to overcome these issues. To clear this hurdle, Dell has been working with partners to explore various approaches to solving the problem effectively. Testing found that a combination of recycled content with virgin plastic produces the best results for mechanical and aesthetic reasons.

Dell’s next hurdle is to show consumers the value of closed-loop recycling. After all, these products look and work the same as those manufactured from new materials. Dell needs to articulate the value proposition to consumers by emphasizing the amount of recycled content in the final product, the nature of the closed-loop products, and the benefits to the customers’ own sustainability objectives. A proper investment in an awareness campaign is required to get mass attention and explain to customers that when they buy Dell products, they are helping the environment bit by bit.

A further concern is the handling of products. Dell’s customers are all over the world, which means take-back initiatives need to operate at a global scale. Products need to be processed in reasonably large quantities to justify the economic and environmental costs of shipping to a centralized hub. This raises issues such as infrastructure and legislation. Closed-border regulation, for example, inhibits the transport of electronic waste in Europe and currently makes it impossible for Dell to set up a branch of its closed-loop supply chain there.

Dell has recovered nearly 800 million kilograms of used electronics and has produced nearly 5,000 tons of plastics from recycled computer parts since it launched the closed-loop plastic recycling project in mid-2014. Dell has saved over USD 1 million from this process and has reduced its carbon footprint by 11% compared with using virgin plastics during manufacturing. Dell is currently using circular plastics across millions of units worldwide in around 90 products.

Furthermore, in order to understand the benefits of moving away from new plastics, Dell performed an analysis. One of the most useful things Dell did was to measure impacts across its activities, including environmental impacts, internal operations, the upstream supply chain, and downstream use and disposal of products, and to turn those impacts into monetary values. Dell measured the net environmental benefit of closed-loop plastic compared with traditional plastic. Next, it valued the net environmental benefit in terms of natural capital, the stock of natural resources that makes human life possible and on which companies rely to produce goods and services. These advantages were then applied to broader applications, including the use of closed-loop plastic across many of Dell’s product lines. Thanks to these measures, findings have shown that Dell’s closed-loop plastic has an environmental benefit 44 percent higher than virgin plastic. Improved reuse of technological devices in general reduced environmental impacts.

The lesson that can be learned from this case study is that businesses need to be aware of what is happening around them, especially where the environment is concerned. Over the past few decades, pollution and the destruction of our environment have risen at an alarming rate. We have seen natural disasters hit us more often, in the form of flash floods, tsunamis, and cyclones, because our actions were not in favor of protecting this planet. With rapid technological advancement, customers are now well informed, and this has created a constantly shrinking window for businesses to connect with their customers and clients (Julie Brown, 2018). With the rise of social media, awareness can be spread easily, and it is an advantage for businesses to connect and create a bond with their clients through that medium.

Furthermore, Dell took a risk by trying new approaches in its sustainable supply chain, where failure was possible, but it made the change at a time when its interests aligned with those of most people today, for whom the environment is a sensitive issue and whose governments are now regulating carbon footprints to combat environmental problems. It can be said that Dell made the right decision to go green at the right time and has succeeded in creating more skilled jobs, growing manufacturing, and improving the local economy. Innovative solutions have the ability to benefit businesses and consumers on a global scale, both economically and environmentally.

 

Database Migration Or The Act Of Moving Computer Science Essay

In DB2’s cost model, a migration consists of two parts. The first is a direct translation: it migrates code from the source flavor of SQL or procedural logic to the target. This translation is a strict syntactic translation of the code. For example, rewriting a for loop because the target system uses a different syntax than the source would be called a translation. The second part is the implementation of workarounds for incompatible cases that have been identified by the analyzer.

A workaround is needed when the target database does not support certain features used in the source system; the migration team must re-implement the part of the code that uses these features and produce new code for the target system with equivalent semantic behavior. For example, calls to a proprietary natural language processing function would require a workaround. The costs of implementing workarounds are stored in the knowledge base. Migration costs vary significantly depending on an extended list of factors, most notably the experience level of the migration team as well as the tools used in the migration process.

For example, some tools can assist with the direct translation between different flavors of SQL, e.g. IBM’s Migration Toolkit, and the use of these tools can significantly reduce the cost.
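To make the idea of a direct translation concrete, the sketch below shows one such syntactic rewrite in Java: SQL Server’s SELECT TOP n clause expressed as DB2’s FETCH FIRST n ROWS ONLY. It is only an illustration of the kind of rule a toolkit automates; the class name and the single regex rule are assumptions of this sketch, not part of IBM’s MTK.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

/**
 * Toy illustration of a single direct-translation rule: rewriting
 * SQL Server's "SELECT TOP n ..." into DB2's "FETCH FIRST n ROWS ONLY".
 * Real toolkits such as IBM's MTK cover far more syntax than this.
 */
public class TopClauseRewriter {

    private static final Pattern TOP = Pattern.compile(
            "(?i)^\\s*SELECT\\s+TOP\\s+(\\d+)\\s+(.*)$", Pattern.DOTALL);

    /** Returns the statement rewritten for DB2, or the input unchanged. */
    public static String rewrite(String sqlServerStatement) {
        Matcher m = TOP.matcher(sqlServerStatement);
        if (!m.matches()) {
            return sqlServerStatement;          // nothing to translate
        }
        String rowCount = m.group(1);
        String rest = m.group(2).replaceAll(";?\\s*$", "");
        return "SELECT " + rest + " FETCH FIRST " + rowCount + " ROWS ONLY";
    }

    public static void main(String[] args) {
        System.out.println(rewrite("SELECT TOP 10 name FROM employees ORDER BY name;"));
        // -> SELECT name FROM employees ORDER BY name FETCH FIRST 10 ROWS ONLY
    }
}
```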

Migration tools also lack the human intelligence to decide that merely converting syntax to syntax may not give the best-performing result either. So, even after the tool converts the code, manual intervention is necessary. These tools also can’t guarantee that the converted code will actually perform the same way as the original code, and they may therefore potentially return incorrect results!

In addition, these migration tools typically focus on the server/database side and completely ignore the application and any embedded SQL or calls to stored procedures. It becomes the client’s responsibility to analyze and manually modify the application logic.

“Data migration solutions extract data from a source system, correct errors, reformat, restructure and load the data into a replacement target system.”

Data migration is not a job to be taken casually. The data is an immensely valuable asset, built up over years of operations. The whole replacement project relies on successful migration. If the migration project runs into problems, the future of your company may be at stake.

Data migration is a field that is frequently overlooked as a simple step between retiring an existing platform and bringing a new one to life.

The actual task of data migration is a complex one, requiring experience and a detailed understanding of the steps to be taken, both in terms of business process and technical implementation. This includes a clear delineation of responsibility between IT and the business, since far too often the job of manipulating data is relegated to IT, well outside of their core competence. Enabling the migration requires a team experienced in abstracting the technical details so that the business team can implement their rules for migration, without ever writing a single line of code, on request to IT.

Data migration requires human intervention. From planning and change control through the data movement and the server or application restart process, there are several levels of staff labor and effort.

This section highlights the often tedious planning, copy, move, and verification tasks that must be completed by IT staff and/or external consulting resources.

“Complete the change without anybody in the business realizing that it has happened”

– Rob Dagley, CTO, Global Technology Services, IBM

There are many challenges associated with data migration. First, the amount of data to be migrated is increasing as rapidly as storage capacity grows. Second, data migration is problematic for most users.

As storage capacity continues to grow exponentially, it is important to reassess the advantages and costs of data migration tools.

Migrating from one database to another is often a risky, expensive, and time-consuming process.

Migrating from one database platform to another can be difficult and time-consuming due to differences in standards between vendors.

Data migration is the process of transferring data between storage types, formats, or computer systems. Data migration is usually performed programmatically to achieve an automated migration, freeing up human resources from tedious tasks. It is required when organizations or individuals change computer systems or upgrade to new systems, or when systems merge (such as when the organizations that use them undergo a merger or takeover). (Wiki, 20/03/2013)

1 Introduction

General Introduction

An independent software vendor (ISV) is an individual or business that builds, develops, and sells consumer or enterprise software. Although ISV-provided software is consumed by end users, it remains the property of the vendor. ISVs often specialize in building applications for a specific niche or business vertical, such as financial, marketing, and educational software.

In short, under the independent software vendor (ISV) model, a company makes, sells, or acquires software. As part of IBM’s strategic business portfolio improvement, this industry giant keeps acquiring new products like OpenPages, i2, Sterling Commerce, UNICA Campaign, PSS Atlas, etc. Such application products typically rely on Relational Database Management System (RDBMS) products as their backend data server.

IBM DB2 is a relational model database server developed by IBM. There are three main products in the DB2 family:

DB2 for LUW (Linux, Unix, and Windows),

DB2 for z/OS (mainframe),

DB2 for iSeries (formerly OS/400).

DB2 UDB is a database leader in several technologies and offers true multi-platform support and scalability. The same database is able to mix workloads on a single server. The DB2 UDB design handles workloads from high-volume online transaction processing (OLTP) to complex multi-user queries while maintaining excellent performance. DB2 UDB is a true cross-platform relational database management system (RDBMS), running on a wide variety of systems including Windows, Solaris, HP-UX, AIX®, and Linux. IBM is pursuing various projects to integrate its internal ISV products with DB2. There are six reasons to migrate a database from the source to IBM DB2: performance, cost savings, usability, scalability, reliability, and efficiency.

Recently IBM acquired a software company, Tealeaf, which is a Customer Experience Management (CEM) company. Tealeaf used a Microsoft SQL Server database, so the task is to enable IBM’s internal Independent Software Vendor (ISV) products on DB2.

Data migration is a subset of the data integration market. Data migration is the process of transferring data from one system to another while changing the storage, database, or application. Typically data migration occurs during an upgrade of existing hardware or a transfer to a completely new system. Examples include migration to or from a hardware platform; upgrading a database or migrating to new software; or company mergers when the parallel systems in the two companies need to be merged into one. There are three main options to accomplish data migration:

Merge the systems from the two companies into a brand new one.

Migrate one of the systems to the other one.

Leave the systems as they are but create a common view on top of them – a data warehouse.

Data migration is a key element to consider when adopting any new system, either through purchase or new development. The process of translating data from one format to another is a migration.

Data migration is necessary when an organization decides to use a new computing system or database management system that is incompatible with the current system.

Data migration is the process of creating a copy of data from one platform to another without interrupting the running applications. Enablement means converting or adjusting the application as required by the database. Integration is nothing but the combination of both data migration and enablement. Tealeaf’s software has a Microsoft SQL Server database, which is integrated into IBM DB2. In the integration, the database is moved from MS SQL Server to the IBM DB2 server (database migration), and accordingly there may be changes in the application (enablement).

Figure 1.1: Typical Migration Architecture

Figure 1.1 shows a typical database migration architecture, in which a source database is migrated into a target database. In this project the source database, Microsoft SQL Server, is migrated to the target database, IBM DB2 (as shown in Figure 1.4). The migration can be done using different tools, such as a database migration toolkit, which are vendor dependent. The tools convert database scripts, tables, stored procedures, triggers, functions, and many other objects from the source to the target database. The migration tool migrates not only the database objects but also the data.

The following Figure 1.2 shows the actual working of the internal ISV–DB2 integration team.

The integration team works on:

Assessment

Database schema migration

Resolving issues / answering queries.

The ISV team works on the other parts, such as unit testing, QA coverage, performance testing, and installation.

Figure 1.2: Internal ISV–DB2 integration / enablement steps.

Assessment

The assessment begins with collecting information about both the source and target environments. Assessment is the evaluation or estimation of the nature, quality, or ability of someone or something; here it means estimating the time and cost required for the migration. It can be evaluated using the following points:

Accessing the environments

In accessing the environments, first gather all the information about the application, the source database, and the target database, and then list the different software/drivers required to connect to the databases. This step gives the cost and time required to access the environments.

Tools

Different types of tools for assessment are available on the market, and they are vendor dependent. The tools carry out specific tasks and calculate the assessment on that basis. IBM uses a tool for assessment known as the Migration Evaluation and Enablement Tool (MEET). It performs an automatic evaluation that reports how many database objects can be migrated directly (compatible objects) and which objects cannot be migrated directly (incompatible objects).
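MEET itself is an IBM tool, so the sketch below is only a toy illustration of the general idea behind such an assessment: scan the schema’s DDL and count objects that look directly migratable versus objects that will need workarounds. The keyword list and class name are hypothetical assumptions of this sketch.

```java
import java.util.List;
import java.util.regex.Pattern;

/**
 * Toy assessment pass: count how many DDL statements contain constructs
 * assumed (for this sketch only) to need a workaround on the target database.
 * This illustrates the idea behind tools like MEET; it does not reproduce them.
 */
public class CompatibilityAssessment {

    // Hypothetical markers of source-only features needing manual attention.
    private static final Pattern INCOMPATIBLE = Pattern.compile(
            "(?i)\\b(IDENTITY|GETDATE\\s*\\(|dbo\\.|NVARCHAR\\s*\\(\\s*MAX\\s*\\))");

    public static void report(List<String> ddlStatements) {
        long incompatible = ddlStatements.stream()
                .filter(s -> INCOMPATIBLE.matcher(s).find())
                .count();
        long compatible = ddlStatements.size() - incompatible;
        System.out.printf("Compatible objects:   %d%n", compatible);
        System.out.printf("Incompatible objects: %d (need workarounds)%n", incompatible);
    }

    public static void main(String[] args) {
        report(List.of(
                "CREATE TABLE dept (id INT PRIMARY KEY, name VARCHAR(50))",
                "CREATE TABLE emp (id INT IDENTITY(1,1), hired DATETIME DEFAULT GETDATE())"));
    }
}
```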

Database schema migration

Figure 1.3: Database migration.

Database migration is rather straightforward, assuming the database is used only as storage. It “only” requires moving the data from one database to another. However, even this may be a difficult task. The main issues one may encounter include:

Unmatched data types (number, date, sub-records)

Different character sets (encoding)

Different data types can be handled easily by approximating them with the closest type in the target database to maintain data integrity. If a source database supports complex data formats (e.g. sub-records) but the target database does not, amending the applications that use the database is necessary. Similarly, if the source database supports a different encoding in each column of a particular table but the target database does not, the applications that use the database need to be thoroughly reviewed.
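A minimal sketch of the type-approximation idea follows, assuming a handful of commonly cited SQL Server to DB2 mappings; a real migration would confirm each mapping against the vendor documentation.

```java
import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Illustrative mapping of a few SQL Server column types to close DB2
 * equivalents. The pairs below are approximations used for this sketch only.
 */
public class TypeMapping {

    private static final Map<String, String> SQLSERVER_TO_DB2 = new LinkedHashMap<>();
    static {
        SQLSERVER_TO_DB2.put("DATETIME",         "TIMESTAMP");
        SQLSERVER_TO_DB2.put("MONEY",            "DECIMAL(19,4)");
        SQLSERVER_TO_DB2.put("NVARCHAR(MAX)",    "CLOB");
        SQLSERVER_TO_DB2.put("UNIQUEIDENTIFIER", "CHAR(36)");
        SQLSERVER_TO_DB2.put("BIT",              "SMALLINT");
    }

    /** Returns the approximate DB2 type, or the input when no rule applies. */
    public static String toDb2(String sqlServerType) {
        return SQLSERVER_TO_DB2.getOrDefault(sqlServerType.toUpperCase(), sqlServerType);
    }

    public static void main(String[] args) {
        SQLSERVER_TO_DB2.keySet()
                .forEach(t -> System.out.println(t + "  ->  " + toDb2(t)));
    }
}
```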

When a database is used not only as data storage but also to represent business logic in the form of stored procedures and triggers, close attention must be paid when performing a feasibility study of the migration to the target database. Again, if the target database does not support some of the features, changes may need to be implemented in the applications or in middleware software.

Data migration is the process of making an exact copy of an organization’s current data from one device to another device, preferably without disrupting or disabling active applications, and then redirecting all input/output (I/O) activity to the new device.

IBM has a series of tools to carry out this migration:

IBM Data Movement Tool (IDMT)

IBM Migration Toolkit (MTK)

Using these tools, migration is done for the compatible objects, and the remaining incompatible objects are migrated manually using the DB2 Command Line Processor (CLP). The migrated database is sent to the ISV team to carry out the next steps. That team performs the unit tests, QA coverage, and performance tests. Any issues or queries regarding the database migration that arise during this process are sent back to the DB2 migration team.

Resolve issues / answer queries

The integration/enablement team will resolve the issues raised by the ISV team. After the issues are resolved, different types of tests are again carried out to check that the application works.

Figure 1.5 shows the flow of the migration process. It contains the following steps:

Assess the migration project: this step comes under the assessment part of the integration team’s work.

Migrate the schema & data: migration of the MS SQL Server database schema and data into IBM DB2 using IBM’s tools.

Migrate the business logic: make changes to the business logic as required by the target database and migrate it. The SQL Server database had several schemas, which are migrated to DB2 by creating three schemas.

Test the converted database: the migrated database must be tested to check whether it is working properly or not (see the sketch after this list).

Migrate the application: after successful database migration, application migration should be done.

Test, integrate & deploy: integrate the migrated application with the target database and then carry out testing and deployment.

Optimize the performance: IBM has different tools that are used to optimize performance.
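As referenced in the testing step above, a simple first check after migration is to compare row counts between the source and target databases. The sketch below does this over JDBC; the connection URLs, credentials, and table names are placeholders, and the SQL Server and DB2 JDBC drivers are assumed to be on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.List;

/**
 * Minimal post-migration sanity check: compare row counts of a few tables
 * between the source (SQL Server) and the target (DB2).
 * URLs, credentials, and table names are placeholders for this sketch.
 */
public class RowCountCheck {

    private static long count(Connection c, String table) throws SQLException {
        try (Statement st = c.createStatement();
             ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }

    public static void main(String[] args) throws SQLException {
        List<String> tables = List.of("CUSTOMERS", "ORDERS", "SESSIONS");
        try (Connection src = DriverManager.getConnection(
                     "jdbc:sqlserver://src-host;databaseName=tealeaf", "user", "pass");
             Connection tgt = DriverManager.getConnection(
                     "jdbc:db2://tgt-host:50000/TEALEAF", "user", "pass")) {
            for (String t : tables) {
                long a = count(src, t), b = count(tgt, t);
                System.out.printf("%-10s source=%d target=%d %s%n",
                        t, a, b, a == b ? "OK" : "MISMATCH");
            }
        }
    }
}
```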

Figure 1.4: Migration process flow.

Figure 1.5: DB2 integration detailed process working.

Integration can be extremely difficult for numerous reasons. First, databases often store the same information in different ways, sometimes drastically so. Mapping between these differences can be time-consuming, especially when ensuring that the data transfers back and forth correctly. Second, data is often generated by third-party programs and then inserted into the database. When you integrate systems, you need to be sure you are not breaking those third-party systems’ communication processes. Last, databases are often stored on different computers on different networks, so keeping the data secure can be tricky.

Data integration is a term covering several distinct sub-areas such as:

Data warehousing

Data migration

Data integration involves combining data from several disparate sources, which are stored using various technologies, and providing a unified view of the data. Data integration becomes increasingly important in cases of merging the systems of two companies or consolidating applications within one company to provide a unified view of the company’s data assets. The latter initiative is often called a data warehouse.

This project works on the second part, where data migration is the main concern.

Server. You can also use the IBM DB2 Migration Toolkit (MTK) to retrieve structural and object information from a SQL Server database.

 

Developing Mobile Enterprise Application Computer Science Essay

Enterprise mobility is the ability of an enterprise to connect to people and control assets from any location. Enterprise mobility includes technologies like wireless networks, mobile applications, middleware, devices, and security and management software. It encompasses everything from the integration of cell phones into an enterprise system to vertically oriented solutions involving the quick delivery of productivity-enhancing information to people in the field, the factory, the warehouse, at cash registers, and at patients’ bedsides.

With enterprise mobility in hand, an organization can benefit from improved productivity, increased customer satisfaction, and faster communication and collaboration between employees, customers, and suppliers.

The ability to work away from the office and on the move is a defining characteristic of modern business. No enterprise hoping to remain competitive can ignore the fact that people live in an increasingly mobile world. From the globetrotting CEO to the mobile field engineer, portable devices such as mobile phones, laptops, and personal digital assistants (PDAs) have radically changed the way people work.

Enterprise mobility results in a significant increase in productivity and provides access to business-critical information. It is a way to cater to the workforce’s needs and to interact with customers, employees, information, assets, and other businesses as and when the organization chooses.

Enterprise mobility is gaining acceptance in Europe, the US, Japan, and elsewhere.

Apart from the basic email, contact, and calendar services that were mobilized earlier, the trend has shifted toward mobilizing enterprise assets like CRM, ERP, SCM, and BI. More than 80% of enterprises in the U.S., U.K., France, and Germany surveyed by Gartner in 2010 have remote workers, amounting to 50 million mobile workers worldwide.

Enterprise mobile application development comprises three development approaches, namely native development, cross-platform development, and mobile web development. The native development approach covers mobile application development for a single platform like Android, iOS, or Nokia Qt, and is characterized by full device integration and support. However, with multiple mobile platforms available today, it is not feasible for an SMB to host separate teams for each mobile OS platform. Thus cross-platform development plays a significant role today, using which a developer can build an application and cross-compile it into applications supported on several platforms. A mobile web application is characterized as an entry-level approach into an organization’s mobility strategy, with the limitation of no offline access.

Examples of the different development platforms are listed below:

Native development: Android SDK, iOS SDK, Windows Phone 7, Nokia Qt SDK

Cross-platform development: Rhomobile, PhoneGap, Titanium Appcelerator

Mobile web development: HTML 5, widgets, WebKit, Sencha framework

Given the three approaches to application development, the choice is wholly dependent upon the project being developed and on the developer’s context. However, an organization cannot overlook the benefits of developing natively, with which it can achieve high-performance delivery and an application that integrates better with the user’s device.

Unlike the native development approach, cross-platform and mobile web development fail to embrace several critical features of the individual development platforms, like access to the device’s contacts, camera, or geolocation, or leveraging each device’s unique set of UI tools and widgets. They also suffer a severe performance hit, unlike native development, thus leaving the organization’s development approach compromised.

Using the native development approach, an organization can drill down to thread-level programming, manage application lifecycle activity, use local storage mechanisms, and gain several other benefits. In the context of my project for developing, comparing, and evaluating two Android-based development platforms, it is imperative that I use the native development approach, which offers me greater flexibility to serve the requirements and thorough device integration.

Early in 2010, Google announced the introduction of a DIY tool called Google AppInventor, using which users can create native Android applications using drag & drop functionality. Based on Scratch (the graphical programming concept from MIT), Google AppInventor has become a Visual Basic of the Android development platform.

Google AppInventor is poised to be the game changer in the mobile application development field, with the capability of leveraging enterprise development tools through its wide set of available components, like web service integration, local SQLite data storage, remote wipe and locking, etc.

The project is centered on the evaluation of native application development methods for Android using the Android SDK and Google AppInventor, with the help of a case study based on an enterprise-level context. The sole purpose of comparing application development using GAI and the Android SDK is to evaluate the capability of Google AppInventor for developing mobile enterprise applications. This will help the mobile developer teams at Patni Computers gauge the capability of GAI in building enterprise mobile applications by leveraging Google AppInventor for fast-paced development.

Google AppInventor not only lets the developer use a visual, drag-and-drop programming style but also helps to minimize the time spent learning each and every construct that the native Android SDK offers. The available components ease the effort of developing complex and feature-rich applications.

The effort involved in developing exactly the same case study using the native Android SDK escalates exponentially if the developer is new to the field of Android mobile application development. It involves dealing with each and every facet of the application being developed, which was implicitly managed in GAI.

To develop the comparison between GAI and the Android SDK, we have taken the case study of an Employee Directory, which encapsulates the facets of a typical mobile enterprise application, like remote access using a web service, offline mode, high performance, and security.

Employee Directory is an application being developed for the employees of Patni Computers, using which they can search for any internal person or group of employees in a Standard Business Unit, anywhere and at any time. It eliminates the manual storing of each required colleague’s contact in their mobile phone book and helps them look up the contact details of the colleagues they need while on the move. This not only provides easier search-and-find options to the employees but also spares them from having to store a huge number of members and their contact details, which are required only at a given point in time, given the dynamic nature of an organization’s work processes.

The user would be able to search for an employee in the organization by name, along with options to browse the employee directory by SBU or designation. Along with searching for an employee, a user can also call, email, or SMS from within the application.
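On Android, the call, email, and SMS actions described above are typically handed off to the platform’s built-in apps through implicit intents. The helper below is a minimal sketch of that wiring, not the project’s actual code; the values passed in would come from the selected employee record.

```java
import android.content.Context;
import android.content.Intent;
import android.net.Uri;

/**
 * Helper showing how the directory app could hand off to the phone's
 * built-in dialer, SMS, and email apps via implicit intents.
 * Call these with an Activity context.
 */
public class ContactActions {

    public static void dial(Context ctx, String phoneNumber) {
        // ACTION_DIAL opens the dialer pre-filled; no CALL_PHONE permission needed.
        ctx.startActivity(new Intent(Intent.ACTION_DIAL,
                Uri.parse("tel:" + phoneNumber)));
    }

    public static void sms(Context ctx, String phoneNumber) {
        ctx.startActivity(new Intent(Intent.ACTION_SENDTO,
                Uri.parse("smsto:" + phoneNumber)));
    }

    public static void email(Context ctx, String address, String subject) {
        Intent intent = new Intent(Intent.ACTION_SENDTO, Uri.parse("mailto:" + address));
        intent.putExtra(Intent.EXTRA_SUBJECT, subject);
        ctx.startActivity(intent);
    }
}
```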

To cope with the huge number of employees in a large organization, a mechanism stores the details of employees recently searched for and accessed via the app in a local SQLite database, for the ease of quickly accessing those who were queried recently and frequently.

Along with this, the project deliverables will also include a feature for users to update their mobile numbers (in case of changes), profile, and other contact details, which is a very commonly encountered situation in any enterprise today. Care will also be taken to synchronize the changed or updated contact details of an employee into a user’s locally stored Recent Logs database.
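The two paragraphs above describe a local cache of recently viewed employees that can also absorb updated contact details. A minimal Android sketch of such a cache, built on SQLiteOpenHelper, is shown below; the database, table, and column names are illustrative assumptions, not the project’s real schema.

```java
import android.content.ContentValues;
import android.content.Context;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;

/**
 * Minimal local cache of recently viewed employees, kept in SQLite on the
 * device so lookups work offline. Names and columns are illustrative.
 */
public class RecentLogsDb extends SQLiteOpenHelper {

    public RecentLogsDb(Context ctx) {
        super(ctx, "recent_logs.db", null, 1);
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        db.execSQL("CREATE TABLE recent_employee ("
                + "emp_id INTEGER PRIMARY KEY, "
                + "name TEXT, phone TEXT, email TEXT, "
                + "viewed_at INTEGER)");
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        db.execSQL("DROP TABLE IF EXISTS recent_employee");
        onCreate(db);
    }

    /** Insert or refresh the cached row for an employee just looked up or updated. */
    public void remember(long empId, String name, String phone, String email) {
        ContentValues row = new ContentValues();
        row.put("emp_id", empId);
        row.put("name", name);
        row.put("phone", phone);
        row.put("email", email);
        row.put("viewed_at", System.currentTimeMillis());
        getWritableDatabase().insertWithOnConflict("recent_employee", null, row,
                SQLiteDatabase.CONFLICT_REPLACE);
    }
}
```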

The purpose of the project is to find out whether Google AppInventor can be used in the enterprise development space to cater to the mobility needs of organizations and to represent the competitive and complex side of the mobile application development world.

The system will provide an enterprise-wide employee directory which will enable access to the enterprise address book.

The system will enhance the mobility resources of the organization, giving the organization’s mobile fleet an extra edge in the form of performance and availability.

Study the impact of mobility on business processes.

The proposed system will also incorporate a method for locking and wiping data from an unsecured mobile client and thus provide the organization with security over mobility issues.

The system will also serve as a medium to get the latest or updated changes in an employee’s contact information.

Future goals may include an enterprise chat service or IM client to facilitate live interaction between employees whenever and wherever.

Following are the deliverables at the completion of the project:

Comparative evaluation of the Android SDK and Google AppInventor as a Mobile Enterprise Application Development Platform (MEAP)

PatniTacts (Employee Directory) Android mobile application using Google AppInventor

PatniTacts (Employee Directory) mobile application using the Android SDK

RESTful web service using the Java JAX-RS Jersey framework, to serve the client mobile applications (a minimal sketch follows this list)
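As noted in the last deliverable above, the directory is served to the mobile clients over a JAX-RS (Jersey) web service. The resource below is a minimal, self-contained sketch of what such an endpoint could look like; the path, field names, and in-memory sample data are assumptions, and a JSON provider such as Jackson is assumed to be configured.

```java
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.QueryParam;
import javax.ws.rs.core.MediaType;
import java.util.List;
import java.util.stream.Collectors;

/**
 * Minimal JAX-RS (Jersey) resource of the kind the directory client could
 * call. The in-memory data below stands in for the real employee database.
 */
@Path("/employees")
public class EmployeeResource {

    /** Simple DTO serialized to JSON by the configured provider. */
    public static class Employee {
        public long id;
        public String name;
        public String phone;

        public Employee(long id, String name, String phone) {
            this.id = id; this.name = name; this.phone = phone;
        }
    }

    private static final List<Employee> DIRECTORY = List.of(
            new Employee(1, "A. Kumar", "+91-20-0000001"),
            new Employee(2, "B. Shah", "+91-20-0000002"));

    /** GET /employees?name=ku — search the directory by partial name. */
    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public List<Employee> search(@QueryParam("name") String name) {
        String needle = name == null ? "" : name.toLowerCase();
        return DIRECTORY.stream()
                .filter(e -> e.name.toLowerCase().contains(needle))
                .collect(Collectors.toList());
    }
}
```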

The scope of our project is bounded by developing a mobile enterprise app which encompasses the complexity associated with developing a real enterprise app, using two different technologies, one of them being a sophisticated, user-friendly, and easy-going development technique offering the capability of developing mobile apps without getting into the know-how and learning confusing programming concepts.

Following are the project objectives followed throughout the project duration:

An Android app having the capabilities of retrieving, storing, and updating employee contact details via a remote enterprise server

It should contain the same level of complexity as that associated with developing a mobile enterprise application

Implementing an agile software development methodology throughout the project process.

Comparing the differences in designing, developing, testing, and deploying the above-described app using Google AppInventor and the standard Android Development Kit.

Evaluating the comparison between the two.

Documenting POCs that explore the ability of AppInventor to meet or extend the capabilities encountered while developing the described app.

Future scope and possible increments to enrich AppInventor’s capabilities toward wholly replacing the ADK (making it obsolete) as used in an IT organization today.

Generalizing the application as much as possible to connect to most web services or databases, like Sybase, Oracle, DB2, or SQL connections, and Google Spreadsheets, to be able to deploy it effectively to other organizations.

The proposed system will virtualize an enterprise-like work environment depicting the same type of database semantics and web services, which will help us exploit the capabilities of Google AppInventor to the fullest, harnessing each and every possibility as far as possible. Hence, as the system is being evolved, we will not be connecting it to the existing organization’s database engines.

With the help of this application, organizations can anticipate a return of several times their ongoing investment, depending on the size of the network or workforce, the number of users in the directory, and the number of directories being integrated with the enterprise directory. Organizations can realize that return in cost savings in the millions of dollars, chiefly in the areas of administration and support.

The effective result of this application would be the minimization of delays in setting up communication between different members of the organization. Thus the intended system will provide benefit by increasing professional availability and responsiveness.

Mobile application development frequently encounters issues like rapidly emerging standards, volatile platforms, varied devices, and inconsistent user interfaces and input technology. Therefore developing mobile applications involves several risky attributes around availability and security.

Besides, it takes a great length of time to develop a dependable mobile solution, with developers often making batches of platform- or device-specific customizations and one-off solutions, resulting in a constant bombardment of patches and updates which is hard to manage in the mobile paradigm.

Therefore, to keep development costs down and guarantee a high-quality application, we have adopted iterative and incremental delivery methods to keep up with the rapid pace and constant change inherent in the enterprise mobility industry.

Therefore, to keep up with requirements in phases, we have implemented an agile-methodology-based, feature-based development strategy to design, build, test, and document the development of the proposed project.

The project work is divided into different sprints, each specifying the development of one feature and containing its requirements, design, build, and test phases.

Following are the primary requirements of the mobile application being developed:

Users are not required to stay connected to the network or cellular data network while they are accessing the personnel information and contact details.

Efforts are to be taken to minimize dependence on the data network as far as possible because of the frequent and inconsistent availability of network data.

A mechanism to store locally the contact details of those who were accessed previously.

Advanced search features which will enable users to search members by role or SBU, or alternatively list every one of them in a particular SBU or role.

Users should also be able to update their contact information and profile information using the application, and these updates should be applied to the employee database or directory.

The design and development of:

Sprints

The various sprints of the project are:

Sprint 1: Login & authentication

Sprint 2: Query module

Sprint 3: Advanced query module

Sprint 4: Result screen module

Sprint 5: Recent Logs module

Sprint 6: Profile

Database

The database engine implemented here will be the MySQL 5.0 engine, managed with the phpMyAdmin GUI tool. The database will be situated at a remote server on the domain patni.tk.

Web Services

The web service would be based on a PHP script which will implement a unique API key to protect it from being exploited by the outside world. The web service will consist of several POST variables, as required while designing the different sprints.
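From the mobile client’s side, calling such a PHP service amounts to an HTTP POST carrying the expected form variables plus the shared API key. The sketch below illustrates this with HttpURLConnection; the service path, parameter names, and key value are placeholders, with only the patni.tk domain taken from the text above.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

/**
 * Sketch of how the mobile client could call the PHP web service: a POST
 * with form variables plus the shared API key. All names are placeholders.
 */
public class DirectoryClient {

    public static String searchByName(String name) throws Exception {
        String body = "api_key=" + URLEncoder.encode("CHANGE-ME", "UTF-8")
                + "&action=search"
                + "&name=" + URLEncoder.encode(name, "UTF-8");

        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://patni.tk/directory/service.php").openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }
        // Read the whole response body as a single string.
        try (Scanner in = new Scanner(conn.getInputStream(), "UTF-8")) {
            return in.useDelimiter("\\A").hasNext() ? in.next() : "";
        }
    }
}
```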

Backend HR Admin:

To replicate the functionality of the PeopleSoft admin module and to provide employees a facility to change, add, and delete their profile. HR admins will also be provided a feature to moderate and flag the employees who have left the organization.

To demonstrate these functionalities, we have used the Joomla CMS to quickly develop a portal for the above activities.

User-Interface:

The user interface of the application will feel the same as when a user runs a native Android application, with the same UI elements.

The web interface will use a Joomla front-end website for administrative work and for the employee modify, add, or delete functionalities.

Internal unit testing: testing of the different modules (login, query, profile, etc.) independently (a minimal JUnit sketch follows this list of testing levels).

Integration testing: testing that the different modules work properly together when integrated, such as the login module registering the session and redirecting to the query module.

System testing: testing of the whole system, from the user logging into the application to the user logging out, after all modules of the system have been integrated.

User acceptance testing (UAT): testing of the whole project after completion and integration of the units together, as accepted by the user.
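As mentioned under internal unit testing above, each module can be exercised in isolation. The JUnit 4 sketch below tests a hypothetical name-matching helper of the kind the query module would contain; it illustrates the testing level and is not the project’s actual test suite.

```java
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

import org.junit.Test;

/**
 * Minimal unit test for a query helper, illustrating module-level testing.
 * matches() is a hypothetical helper: case-insensitive partial name match.
 */
public class QueryModuleTest {

    /** The helper under test; in the real app this would live in the query module. */
    static boolean matches(String employeeName, String query) {
        return employeeName.toLowerCase().contains(query.toLowerCase());
    }

    @Test
    public void findsEmployeeByPartialName() {
        assertTrue(matches("Anita Kumar", "kum"));
    }

    @Test
    public void rejectsNonMatchingQuery() {
        assertFalse(matches("Anita Kumar", "xyz"));
    }
}
```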

Herein, the developed app will be signed and submitted to the Android marketplace for review and posted to the Android Market.

Further, any updates will be delivered over the air using the Android Market.

The first two weeks after joining the company were used to understand the enterprise mobility standpoint, the ramp-up study, knowledge transition, and the induction process. Considerable time was also spent learning the Java programming language, along with lab work and assignments, which was a prerequisite for starting with the Android Software Development Kit.

Thereafter, a week was dedicated to the study of different Mobile Enterprise Application Platforms (MEAPs) and to learning tutorials to get started with Google AppInventor application development.

The following chart depicts the schedule for developing the Employee Directory application using Google AppInventor:

The chart below shows the schedule for the second phase, namely native development using the Android Software Development Kit:

The remaining time will be used for evaluating and comparing development using Google AppInventor and the Android Software Development Kit. Much of it will also be spent enhancing the application to include more features, namely enterprise messenger services and email.

A generous amount of time will also need to be devoted to polishing the application to make it more stable and dependable.

 

Decided To Set Up Mssql Database Computer Science Essay

Setting up an MSSQL database was comparatively easy. The first step was to download the installer package from Microsoft’s website. The package I chose is Microsoft SQL Server 2012 Express with Advanced Services. The reason I chose this package is that it contains all the components of SQL Express, which consist of the database engine, the Express tools, Reporting Services, and full-text search. With this package I will be well equipped to proceed with my coursework.

Installing the package was a comparatively painless and user-friendly affair.

Throughout the installation process, I left all the settings at their defaults, and everything installed perfectly without a hitch.

Then, in order to populate the database, I went to http://www.sqlskills.com/blogs/bobb/post/mondial-database-for-sql-server-2008.aspx to download the 3 files. The 3 files are:

modial_latlong.sql

mondial-inputs-sqlserver2008.sql

mondial-schema-sqlserver2008.sql

By running and executing these 3 files, I managed to successfully populate the database.
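For completeness, the sketch below shows one way to run those three scripts from Java by shelling out to the sqlcmd utility that ships with SQL Server. The instance name, database name, and execution order are assumptions of this sketch; the original text does not state them.

```java
import java.io.File;
import java.util.List;

/**
 * Sketch: run the three Mondial scripts against the local SQL Server 2012
 * Express instance by shelling out to sqlcmd. Assumes sqlcmd is on PATH,
 * the default instance name ".\\SQLEXPRESS", and Windows authentication.
 */
public class LoadMondial {

    public static void main(String[] args) throws Exception {
        List<String> scripts = List.of(
                "mondial-schema-sqlserver2008.sql",
                "mondial-inputs-sqlserver2008.sql",
                "modial_latlong.sql");

        for (String script : scripts) {
            Process p = new ProcessBuilder(
                    "sqlcmd", "-S", ".\\SQLEXPRESS", "-E",
                    "-d", "mondial", "-i", new File(script).getAbsolutePath())
                    .inheritIO()
                    .start();
            int exit = p.waitFor();
            System.out.println(script + " finished with exit code " + exit);
        }
    }
}
```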

A2

Database Engine

What's New (Database Engine)

Availability Enhancements (Database Engine)

Describes enhancements to high availability features.

Cross-cluster migration of AlwaysOn Availability Groups for OS upgrades.

Introduces features of AlwaysOn Availability Groups such as AlwaysOn SQL Server Failover Cluster Instances and AlwaysOn Availability Groups tools.

Manageability Enhancements (Database Engine)

Describes enhancements to tools and monitoring features.

Enhancements include alternate keyboard shortcut schemes, improvements to the query editor, changes to startup options, contained databases, data-tier applications, Windows PowerShell, and the Database Engine Tuning Advisor.

List of new and modified dynamic management views and functions.

Programmability Enhancements (Database Engine)

Describes programmability enhancements in the Database Engine.

Enhancements include FileTables, statistical semantic search, property-scoped full-text search and customizable proximity search, ad hoc query paging, circular arc segment support for spatial types, support for sequence objects, and default support for 15,000 partitions.

List of 14 new functions and 1 changed function for Transact-SQL.

Scalability and Performance Enhancements (Database Engine)

Describes scalability and performance enhancements in the Database Engine.

Enhancements include columnstore indexes, increased partition support, FILESTREAM filegroups that can contain multiple files, and online index create, rebuild, and drop.

Security Enhancements (Database Engine)

Describes security enhancements in the SQL Server Database Engine.

Security enhancements in the SQL Server Database Engine include provisioning during setup, new SEARCH PROPERTY LIST permissions, new user-defined server roles, a default schema for groups, SQL Server Audit enhancements, database engine access through contained databases, hashing algorithms, further deprecation of RC4, certificate key length, the change of service master key (SMK) and database master key (DMK) encryption from 3DES to AES, and certificates that can be created from binary.

Resource Governor Enhancements (Database Engine)

Describes Resource Governor enhancements in the Database Engine.

Enhancements include support for 64 resource pools, greater CPU usage control, and resource pool affinity for partitioning of physical resources and predictable resource allocation.

Microsoft.SqlServer.Dac and Microsoft.SqlServer.Dac.Extensions

The Microsoft.SqlServer.Dac namespace provides classes to perform operations on DACPAC and BACPAC packages.

The Microsoft.SqlServer.Dac.Extensions namespace provides classes containing extension methods to retrieve information from DACPAC and BACPAC packages or to use expanded functionality not currently present in the Microsoft.SqlServer.Dac namespace.

SQL Server Database Engine Backward Compatibility

Deprecated Database Engine Features in SQL Server 2012

List of features that are still available in SQL Server 2012 but are to be removed in a future version of SQL Server.

Discontinued Database Engine Functionality in SQL Server 2012

List of Database Engine features that are no longer available in SQL Server 2012.

Breaking Changes to Database Engine Features in SQL Server 2012

List of changes in the SQL Server 2012 Database Engine that may break applications, scripts, or functionality based on earlier versions of SQL Server.

The list includes changes to Transact-SQL, dynamic management views, catalog views, SQL CLR data types (geometry, geography, and hierarchyid), and XQuery functions.

Behavior Changes to Database Engine Features in SQL Server 2012

Behavior changes include metadata discovery, changes in behavior when scripting a SQL Server Agent task, constant folding for CLR user-defined functions and methods, behavioral changes to the STEnvelope() method with empty spatial types, an optional parameter for the LOG function, changes to statistics computation during partitioned index operations, changes to data type conversion by the XML value method, an sqlcmd.exe behavior change in XML mode, and a change in the behavior of the exist() function on the xml data type.

SQL Server Management Tools Backward Compatibility

Deprecated Management Tools Features in SQL Server 2012

List of deprecated Management Tools features that are still available in SQL Server 2012 but are scheduled to be removed in future releases of SQL Server.

Discontinued Management Tools Features in SQL Server 2012

Lists the SQL Server Management Tools features that are no longer available in SQL Server 2012. These features/components include SQL Server Compact Edition, the ActiveX subsystem for SQL Server Agent, Net Send and pager notification, sp_addtask, sp_deletetask, sp_updatetask, and some data-tier applications.

Breaking Changes to Management Tools Features in SQL Server 2012

SQL Server 2012 management tools cannot be used to create a utility control point on SQL Server 2008 R2.

SMO has been reversioned in SQL Server 2012.

Behavior Changes to Management Tools Features in SQL Server 2012

Other changes to Management Tools features in this release.

Database Engine Features and Tasks

Database Engine Instances (SQL Server)

Describes what an instance of the Database Engine is.

Explains related tasks such as configuring Database Engine instances, collation and Unicode support, linked servers, managing the Database Engine services, server network configuration, Database Engine scripting, maintenance plans, Resource Governor, Database Mail, extended events, SQL Trace, SQL Server Profiler, tracking data changes, the log file viewer, the Database Engine Tuning Advisor, the diagnostic connection for database administrators, remote servers, and Service Broker.

Database Features

Describes features and tasks associated with databases, database objects, data types, and the mechanisms used to work with or manage data.

Topics included in this section include databases, tables, indexes, partitioned tables and indexes, views, stored procedures, search, user-defined functions, statistics, plan guides, cursors, sequence numbers, DDL triggers, DML triggers, synonyms, XML data, spatial data, binary large object data, data-tier applications, the transaction log, database checkpoints, backup and restore of SQL Server databases, bulk import and export of data, data compression, OLE Automation, event notifications, and monitoring and tuning for performance.

Database Engine Cross-Instance Features

Describes the tools and tasks associated with managing and monitoring servers and database instances.

The tools and tasks described include SQL Server Management Studio, SQL Server Utility features and tasks, administering servers by using policy-based management, registered servers, SQL Server Distributed Replay, data collection, monitoring resource usage, administering multiple servers using central management servers, SQL Server Configuration Manager, and Activity Monitor.

High Availability Solutions (SQL Server)

The high-availability options include AlwaysOn Failover Cluster Instances, AlwaysOn Availability Groups, database mirroring, and log shipping.

Security and Protection (Database Engine)

The topics discussed under security and protection include securing SQL Server, principals, server-level roles, database-level roles, credentials, securables, choosing an authentication mode, surface area configuration, the TRUSTWORTHY database property, password policy, strong passwords, SQL Server encryption, SQL Server certificates and asymmetric keys, and SQL Server Audit.

Technical Reference (Database Engine)

Feature Reference

Topics include Registered Servers F1 help, SQL Server Management Studio help, and Visual Database Tools F1 help.

Command Prompt Utility Reference

Topics include the bcp utility, the dta utility, and the SqlLocalDB utility.

Database Engine PowerShell Cmdlets

Discusses the Invoke-PolicyEvaluation cmdlet and the Invoke-Sqlcmd cmdlet.

Errors and Events Reference

List of Database Engine events and errors.

Showplan Logical and Physical Operators Reference

Describes the use and implementation of the logical and physical operators.

Transact-SQL Reference (Database Engine)

Lists the kinds of applications that can generate Transact-SQL.

XQuery Language Reference (SQL Server)

Topics related to XQuery discussed in this section include XQuery basics, XQuery expressions, modules and prologs, XQuery functions against the xml data type, XQuery operators against the xml data type, and additional sample XQueries against the xml data type.

Data Quality Services (DQS)

Introducing Data Quality Services

The Business Need for DQS

Explains why businesses have a need for DQS.

Answering the Need with DQS

To resolve data quality issues, DQS provides features such as data cleansing, matching, reference data services, profiling, monitoring, and a knowledge base.

A Knowledge-Driven Solution

The DQS knowledge-driven solution uses knowledge management processes and data quality projects to cleanse data.

DQS Components

DQS comprises Data Quality Server and Data Quality Client.

Data Quality Functionality in Integration Services and Master Data Services

Describes the DQS Cleansing component in Integration Services and the data quality processes in Master Data Services.

Data Quality Services Concepts

Knowledge Management Concepts

Describes the processes used to create and manage the knowledge base.

These processes include knowledge discovery, domain management, matching policy, and reference data services.

Data Quality Project Concepts

Describes the various concepts of data quality projects, including data cleansing, data matching, and profiling and notifications.

Data Quality Administration Concepts

Administrative tasks using the Data Quality Client application include activity monitoring, configuration, and DQS security.

Data Quality Services Features and Tasks

Data Quality Client Application

The Data Quality Client application is used to perform data quality operations using a standalone tool.

The client application can be used as a DQS KB Operator, a DQS KB Editor, and a DQS Administrator.

DQS Knowledge Bases and Domains

Describes how to create and build a DQS knowledge base.

Discusses knowledge discovery, domain management, and data matching.

Data Quality Projects (DQS)

Explains the various data quality projects, including the cleansing activity, the matching activity, and data profiling and notifications.

Data Cleansing

Describes the various facets of data cleansing, such as computer-assisted cleansing, interactive cleansing, leading value correction, and standardizing cleansed data.

Data Matching

Explains how to perform data matching, build a matching policy, and run a matching project.

Reference Data Services in DQS

Discusses the use of reference data from the Windows Azure Marketplace, using reference data directly from third-party reference data providers, and how to cleanse data by using the reference data.

Data Profiling and Notifications in DQS

Explains how profiling works, profiling data by activity, profiling data in activity monitoring, and notifications.

DQS Administration

Describes administration activities performed using the Data Quality Client and administration activities outside of the Data Quality Client.

DQS Security

The DQS security infrastructure is based on the SQL Server infrastructure.

Discusses the DQS roles and user management.

Analysis Services

What's New (Analysis Services)

SQL Server 2012 Service Pack 1 (SP1)

New features include PowerPivot in Excel, PowerPivot for SharePoint, spPowerpivot.msi, version compatibility for tabular models, and importing from PowerPivot in Excel 2013.

SQL Server 2012 What's New by Feature

New features are added in the following areas: server instance and server monitoring, tabular modeling, multidimensional modeling, PowerPivot for Excel, PowerPivot for SharePoint, programmability, and design tools.

Analysis Services Backward Compatibility

Discontinued Analysis Services Functionality in SQL Server 2012

Migration Wizard, used to migrate SQL Server 2000 Analysis Services databases to newer versions, is discontinued because SQL Server 2000 is no longer supported.

The Decision Support Objects (DSO) library that provided compatibility with SQL Server 2000 Analysis Services databases is also discontinued and is no longer part of SQL Server.

Deprecated Analysis Services Functionality in SQL Server 2012

Features not supported in the next version of SQL Server include the InsertInto connection string property, the CreateCube connection string property, SQL Server 2000 PMML, and the Create Action statement.

Features not supported in future versions of SQL Server include the CalculationPassValue function, the CalculationCurrentPass function, the NON_EMPTY_BEHAVIOR query optimizer hint being turned on by default, the CELL_EVALUATION_LIST intrinsic cell property, and COM assemblies.

Behavior Changes to Analysis Services Features in SQL Server 2012

The cube browser in Management Studio and Cube Designer has been removed.

Higher permission requirements for using a PowerPivot workbook as an external data source.

PowerPivot Gallery: new rules for snapshot generation for some PowerPivot workbooks.

The default setting for load-balancing requests changed from round-robin to health-based.

Breaking Changes to Analysis Services Features in SQL Server 2012

Setup commands removed for a PowerPivot for SharePoint installation.

Analysis Services Features and Tasks

Comparing Tabular and Multidimensional Solutions (SSAS)

Topics discussed in this section include data source support by solution type, model features, model size, programmability and extensibility support, query and scripting language support, security feature support, design tools, client application support, SharePoint requirements, and server deployment modes for multidimensional and tabular solutions.

Analysis Services Instance Management

Topics discussed in this section include determining the server mode of an Analysis Services instance, configuring service accounts (Analysis Services), configuring server properties in Analysis Services, configuring HTTP access to Analysis Services on Internet Information Services (IIS) 7.0, connecting to an Analysis Services instance, registering an Analysis Services instance in a server group, disconnecting users and sessions on an Analysis Services server, renaming an Analysis Services instance, monitoring an Analysis Services instance, and scripting administrative tasks in Analysis Services.

Tabular Modeling (SSAS Tabular)

Discusses tabular model solutions, tabular model databases, and tabular model data access.

Multidimensional Modeling (SSAS)

Discusses multidimensional model solutions (SSAS), multidimensional model databases (SSAS), multidimensional model object processing, multidimensional model partition management, multidimensional model roles and permissions, and multidimensional model assemblies management.

Data Mining (SSAS)

Discusses data mining concepts, data mining algorithms, mining structures, mining models, testing and validation, data mining queries, data mining solutions, data mining tools, data mining architecture, and the security overview (Data Mining).

PowerPivot for SharePoint (SSAS)

Explains the benefits of PowerPivot for SharePoint.

Topics discussed include PowerPivot for Excel, PowerPivot configuration in Central Administration, PowerPivot configuration using PowerShell, the PowerPivot Configuration Tool, PowerPivot authentication and authorization, PowerPivot server administration, PowerPivot server health, the PowerPivot management dashboard, PowerPivot usage data collection, PowerPivot Gallery, PowerPivot data access, PowerPivot data refresh, PowerPivot data feeds, and PowerPivot BI Semantic Model Connection (.bism).

Technical Reference (SSAS)

Data Mining Stored Procedures (Analysis Services – Data Mining)

The stored procedures include SystemGetCrossValidationResults, SystemGetClusterCrossValidationResults, SystemGetAccuracyResults, and SystemGetClusterAccuracyResults.

Errors and Events Reference (PowerPivot for SharePoint)

Provides information about errors and events for PowerPivot for SharePoint. Errors in this section are identified by the error message text that appears in a log or error window.

Analysis Services PowerShell

Describes the prerequisites, supported versions and modes of Analysis Services, authentication requirements and security considerations, and Analysis Services PowerShell tasks.

PowerPivot Reference for SharePoint PowerShell

List of PowerShell cmdlets used to configure or administer a PowerPivot for SharePoint installation.

Multidimensional Expressions (MDX) Reference

Description of MDX syntax elements and the MDX language reference.

Data Analysis Expressions (DAX) Reference

Describes the DAX syntax specification for PowerPivot, the DAX operator reference for PowerPivot, and the DAX function reference.

Data Mining Extensions (DMX) Reference

Provides information regarding DMX statements.

User Interface Reference (Analysis Services)

Help topics for the Analysis Services wizards (multidimensional data), data mining designers and dialog boxes, Analysis Services designers and dialog boxes (multidimensional data), Analysis Services designers and dialog boxes (tabular), and data mining wizards.

Integration Services

What's New (Integration Services)

Deployment

Discusses the deployment of projects and packages, deployment of projects to the Integration Services server, and the Integration Services server.

Management and Troubleshooting

Describes the server environments, the SSISDB catalog, troubleshooting performance and data issues, and reports for troubleshooting package operations.

Development Enhancements

Describes the project connection managers, offline connection managers, flat file connection manager changes, parameters, the Execute Package task and parameters, comparing and merging packages, undo/redo in SSIS Designer, column mapping, and the Script task and Script component.

Performance

Reduced memory usage by the Merge and Merge Join transformations.

Data Quality

Explains the DQS Cleansing transformation.

Access to Samples and Tutorials

The Getting Started window in SSIS Designer provides links to samples, tutorials, and videos.

The SSIS Toolbox in SQL Server Data Tools (SSDT) provides links to samples and Help content for Control Flow and Data Flow items.

Integration Services Backward Compatibility

Deprecated Integration Services Features in SQL Server 2012

There are no deprecated Integration Services features in SQL Server 2012.

Discontinued Integration Services Functionality in SQL Server 2012

The features that are discontinued in the current release of SQL Server Integration Services are data source views, the data viewer, Data Transformation Services (DTS), the Execute DTS 2000 Package task, and the ActiveX Script task.

Breaking Changes to Integration Services Features in SQL Server 2012

There are no breaking changes in SQL Server 2012 Integration Services (SSIS) features.

Behavior Changes to Integration Services Features in SQL Server 2012

There are no behavior changes in SQL Server 2012 Integration Services (SSIS) features.

Integration Services Features and Tasks

Integration Services (SSIS) and Studio Environments

Describes how to use SQL Server Data Tools and SQL Server Management Studio.

Integration Services (SSIS) Packages

Discusses the package contents, package templates, objects that extend package functionality, package properties that support extended features, custom log entries available on the package, and the configuration of packages.

Integration Services (SSIS) Connections

Describes the different connections used to perform different tasks.

Lists the connection manager types that are provided.

Integration Services (SSIS) Projects

Explains what a project is, and what a solution is, for Integration Services projects.

Integration Services (SSIS) Parameters

Explains the parameters used with the package and project deployment models.

Integration Services (SSIS) Queries

Explains the execution of SQL in files and SQL in variables.

Integration Services (SSIS) Expressions

Describes the components that use expressions, icon markers for expressions, and the Expression Builder.

Integration Services (SSIS) Variables

Lists the variables that are used in Integration Services, as well as their purposes.

Integration Services (SSIS) Event Handlers

Describes the tasks that event handlers can perform, event handler content, run-time events, and configuring an event handler.

Integration Services Service (SSIS Service)

Lists the management capabilities provided by the Integration Services service.

Integration Services (SSIS) Server

Explains the purpose of the Integration Services (SSIS) server, high availability, and the use of the Integration Services server in SQL Server Management Studio.

Deployment of Projects and Packages

Compares the differences between project deployment and package deployment.

Lists the features of the project deployment model.

Describes project deployment and the required tasks.

Execution of Projects and Packages

Lists the tools used to run an Integration Services package.

Describes the execution and logging of Integration Services packages.

Monitoring for Package Executions and Other Operations

Describes how logs, reports, views, performance counters, and data taps are used to monitor Integration Services operations.

Lists the operation types monitored.

Security Overview (Integration Services)

Explains how identity features ensure that packages only open from trusted sources, and how access control features allow only authorized users to open and run packages.

Technical Reference (Integration Services)

Integration Services Error and Message Reference

List of error messages, warning messages, informational messages, general and event messages, success messages, and data flow component error messages.

Views (Integration Services Catalog)

List of the Transact-SQL views that are used in the administration of Integration Services.

Stored Procedures (Integration Services Catalog)

List of the Transact-SQL stored procedures that are used in the administration of Integration Services projects.

Functions

List of the Transact-SQL functions that are used in the administration of Integration Services projects.

Master Data Services

What's New (Master Data Services)

The new features in SQL Server 2012 Master Data Services include using Excel to manage master data, matching data before loading, loading data into MDS using entity-based staging, new model deployment tools, a redesigned and higher-performance web user interface, newly introduced SharePoint integration, support for multi-level recursive hierarchies, improved many-to-many mapping, automatically generated codes, simplified security, and installation as part of SQL Server setup.

Backward Compatibility (Master Data Services)

Deprecated Master Data Services Features in SQL Server 2012

The staging process from SQL Server 2008 R2 and metadata will be removed in future editions of SQL Server.

Discontinued Master Data Services Features in SQL Server 2012

Model object permissions can no longer be assigned to the Derived Hierarchy, Explicit Hierarchy, and Attribute Group objects.

The new staging process cannot be used to create or delete collections, add members to or remove members from collections, or reactivate members and collections.

Other discontinued features include the model deployment wizard, code generation business rules, bulk updates and exporting, and PowerShell cmdlets.

Master Data Services Features and Tasks

Models (Master Data Services)

Explains how models relate to other objects.

Describes an example of a model.

Entities (Master Data Services)

Explains how entities relate to other model objects.

Describes the use of entities as constrained lists, base entities, entity security, and entity examples.

Attributes (Master Data Services)

Explains how attributes relate to other model objects.

Describes required attributes, attribute types, attribute examples, and related tasks.

Domain-Based Attributes (Master Data Services)

Explains the use of the same entity for multiple domain-based attributes, how domain-based attributes form derived hierarchies, and an example of a domain-based attribute.

Attribute Groups (Master Data Services)

Explains how attribute groups change the display, how to show or hide attribute groups, as well as related tasks.

Master Data Services Add-in for Microsoft Excel

Lists related tasks for the MDS Add-in for Excel.

Members (Master Data Services)

Explains how members relate to other model objects, and using hierarchies and collections to organize members.

Describes member types, examples, and related tasks.

Transactions (Master Data Services)

Explains when transactions are recorded, and how to view and manage transactions.

Describes system settings and concurrency.

Annotations (Master Data Services)

Explains how to annotate a transaction.

Hierarchies (Master Data Services)

Describes what hierarchies contain, the kinds of hierarchies, and an example of a hierarchy.

Explains that hierarchies are not taxonomies.

Collections (Master Data Services)

Describes what collections contain, and the subscription views for collections.

Business Rules (Master Data Services)

Describes the creation and publication of business rules, how business rules are applied, the system settings, and related tasks.

Validation (Master Data Services)

Explains when data validation occurs, and related tasks.

Versions (Master Data Services)

Explains when to use versions.

Describes version flags, the workflow for version management, sequential or concurrent versions, and related tasks.

Notifications (Master Data Services)

Explains how notifications are sent, and when notifications are sent.

Describes system settings, and related tasks.

Security (Master Data Services)

Explains how to set security.

Describes the different types of users, security in the Add-in for Excel, and related tasks.

Importing Data (Master Data Services)

Lists the staging tables.

Explains how to initiate the staging process, log transactions, validate data, and related tasks.

Exporting Data (Master Data Services)

Lists the subscription view formats, and related tasks.

Deploying Models (Master Data Services)

Describes the tools for deploying models, what packages contain, sample packages, and related tasks.

Developer's Guide (Master Data Services)

Describes the use of the Master Data Manager web service, custom workflows, and web service namespaces.

Microsoft.MasterDataServices

Lists the classes that provide the main entry point for Master Data Services.

Technical Reference (Master Data Services)

Master Data Services Configuration Manager

A tool used to create or configure a Master Data Services database.

Master Data Services Database

Describes the leaf member staging table, the consolidated member staging table, the relationship staging table, and staging process errors.

Master Data Manager Web Application

Describes the Explorer functional area, the Version Management functional area, the Integration Management functional area, the System Administration functional area, and the User and Group Permissions functional area.

SQL Server Replication

What's New (Replication)

Replication support for AlwaysOn Availability Groups.

Replication extended events.

Support for 15,000 partitions.

Replication Backward Compatibility

Deprecated Features in SQL Server Replication

Deprecated features in SQL Server 2012 include RMO, heterogeneous replication, and Oracle publishing.

Breaking Changes in SQL Server Replication

There are no breaking changes in SQL Server 2012 for replication features.

Replication Features and Tasks

Types of Replication

The types of replication are transactional replication, merge replication, and snapshot replication.

Replication Agents

Describes the Distribution Agent, the Log Reader Agent, the Merge Agent, the Queue Reader Agent, the Snapshot Agent, and agent administration.

Security and Protection (Replication)

Explains how encryption can be used to reduce threats to a replication topology, and how to carry out identity and access control.

Describes the replication agent security model, security role requirements for replication, and replication security best practices.

Explains how to secure the Distributor, the Publisher, the Subscriber, and the snapshot folder.

Monitoring (Replication)

Describes Microsoft SQL Server Replication Monitor, Microsoft SQL Server Management Studio, Transact-SQL and Replication Management Objects (RMO), alerts for replication agent events, and System Monitor.

Scripting Replication

Describes an example of automating a task with scripts.

Explains scripting replication objects.

Replication over the Internet

Replication can be carried out over the internet via a Virtual Private Network (VPN) or the web synchronization option for merge replication.

Heterogeneous Database Replication

Describes publishing data from Oracle and publishing data to non-SQL Server subscribers.

Configure Replication for AlwaysOn Availability Groups (SQL Server)

Lists the required steps for configuring replication and AlwaysOn Availability Groups.

Maintaining an AlwaysOn Publication Database (SQL Server)

Describes how to maintain/remove a published database in/from an availability group.

Replication, Change Tracking, Change Data Capture, and AlwaysOn Availability Groups (SQL Server)

Describes publisher redirection, changes to replication agents to support AlwaysOn Availability Groups, stored procedures supporting AlwaysOn, change data capture, change tracking, as well as the prerequisites, restrictions, and considerations for using replication with AlwaysOn Availability Groups.

Log Shipping and Replication (SQL Server)

Describes the requirements and procedures for replicating from the secondary if the primary is lost.

Database Mirroring and Replication (SQL Server)

Lists the requirements and considerations for using replication with database mirroring.

Explains how to configure replication with database mirroring, and maintaining a mirrored publication database.

Technical Reference (Replication)

Replication Views (Transact-SQL)

List of the system views used by replication.

Replication Tables (Transact-SQL)

List of the system tables used by replication.

Replication Stored Procedures (Transact-SQL)

List of the stored procedures used by replication.

Properties Reference

List of information for the various replication wizards and dialog boxes.

Tools Reference

Tools for implementing, administering, and troubleshooting replication include SQL Server Management Studio, programming interfaces, and other Microsoft Windows components.

Errors and Events Reference

Lists the cause and resolution information for a number of errors related to replication.

Reporting Services (SSRS)

What's New (Reporting Services)

SQL Server 2012 Service Pack 1 (SP1)

Added support for Power View in Microsoft Excel 2013 and Power View in Microsoft SharePoint 2013.

The SQL Server 2012 SP1 Reporting Services report server in SharePoint mode now supports SharePoint 2013.

A new version of the Reporting Services add-in for SharePoint supports SharePoint 2013 and SharePoint 2010.

View and interact with reports on iOS devices.

Power View

Describes the properties of Power View, such as being based on tabular models and coexisting with Report Builder.

SharePoint Mode

Describes the list of changes made to SharePoint integration to improve the SharePoint IT administrator experience, the end-user experience, and supportability.

Data Alerts

Data alerting allows the user to define and save data alert definitions, run data alert definitions, and deliver data alert messages to recipients.

Data alerts also provide tools, including the data alert designer, and data alert managers for users and administrators.

Report Server Projects in SQL Server Data Tools for Visual Studio

Describes the various aspects of report server projects.

Excel Renderer for Microsoft Excel 2007-2010 and Microsoft Excel 2003

Improvements in the rendering extension include increasing the maximum rows per worksheet to 1,048,576, increasing the maximum columns per worksheet to 16,384, allowing about 16 million colors in a worksheet (24-bit color), and ZIP compression that now provides smaller file sizes.

Word Renderer for Microsoft Word 2007-2010 and Microsoft Word 2003

The Word renderer allows SQL Server 2012 to render a report in Microsoft Word.

Reporting Services Backward Compatibility

Deprecated Features in SQL Server Reporting Services in SQL Server 2012

Deprecated device information settings for the HTML rendering extension include ActionScript, ActiveXControls, GetImage, OnlyVisibleStyles, ReplacementRoot, ResourceStreamRoot, StreamRoot, UsePx, and Zoom.

Other deprecated features include Microsoft Word and Microsoft Excel 1997-2003 rendering, Report Definition Language (RDL) 2005 and earlier, SQL Server 2005 and earlier custom report items, Reporting Services snapshots 2005 and earlier, and report models.

Discontinued Functionality to SQL Server Reporting Services in SQL Server 2012

Currently there is no discontinued Reporting Services functionality in SQL Server 2012.

Breaking Changes in SQL Server Reporting Services in SQL Server 2012

Breaking changes include that SharePoint mode server references now require the SharePoint site, the Reporting Services WMI provider no longer supports configuration of SharePoint mode, and Report Model Designer is no longer available in SQL Server Data Tools.

Describes the list of changes to SharePoint mode command-line installation.

Behavior Changes to SQL Server Reporting Services in SQL Server 2012

Behavior changes include that the View Items permission no longer downloads shared datasets, Report Server trace logs are in a new location for SharePoint mode, the GetServerConfigInfo SOAP API is no longer supported, the Reporting Services Configuration Manager is not used for SharePoint mode, and that server modes cannot be changed from one mode to another.

Reporting Services Concepts (SSRS)

Report Server Concepts

Describes the report server (a report server configured in native mode), Reporting Services as a SharePoint server (a report server configured in SharePoint mode), report server items, folders, roles and permissions, schedules, subscriptions and delivery, extensions, and report access.

Reports and Related Item Concepts

Describes reports and report definitions, report data connections and data sources, report datasets, report parameters, report items, data regions and maps, report parts, and data alerts.

Types of Reports

The different types of reports include drilldown reports, subreports, main/detail reports and drillthrough reports, linked reports, history reports, cached reports, snapshots, saved reports, published reports, and upgraded reports.

Stages of Reports

The stages of reports are the report definition, the compiled report and intermediate report format, the snapshot or report history, the processed report, the rendered report, and the exported report.

Reporting Services Features and Tasks (SSRS)

Reporting Services Report Server (SSRS)

The features and modes provided by the Reporting Services report server include native mode, native mode with SharePoint web parts, SharePoint mode, the report processor, the report server database, and authentication and rendering.

Reporting Services Reports (SSRS)

Lists the benefits of Reporting Services reports, the stages of report processing, creating reports, previewing reports, saving or publishing reports, viewing reports, managing reports, securing reports, creating notifications based on report data, upgrading reports, and troubleshooting reports.

Report Data (SSRS)

Lists tips for specifying report data, data connections, data sources, and datasets.

Report Parameters (Report Builder and SSRS)

Describes the common uses for parameters, types of parameters, creating parameters, report parameters, dataset parameters, cascading parameters, displaying parameter values in a report, setting parameters on a drillthrough report, setting parameters on a subreport, managing parameters on a published report, setting parameters in a report URL, setting parameters for a subscription, setting parameters for a snapshot, and parameters and securing data.

Report Parts in Report Designer (SSRS)

Explains the life cycle of report part publication, and how to publish, reuse, and republish report parts.

Schedules

Topics described include what you can do with schedules, comparing shared and report-specific schedules, configuring data sources, storing credentials and processing accounts, how scheduling and delivery processing works, server dependencies, and the effects of stopping the SQL Server Agent and Report Server services.

Subscriptions and Delivery (Reporting Services)

Describes the subscription scenarios, standard and data-driven subscriptions, subscription requirements, delivery extensions, and the parts of a subscription.

Data Alerts (SSRS)

Describes the data alerts architecture and workflow, permissions for data alerts, diagnostics and logging, performance counters, support for SSL, the data alerts user interface, and the globalization of data alerts.

Explains how to install and configure data alerts.

Power View (SSRS)

Describes the features of Power View, including creating data visualizations, filtering and highlighting data, sorting, and creating reports with multiple views.

Security and Protection (SSRS)

Describes the authentication and authorization processes in Reporting Services.

URL Access (SSRS)

Discusses URL access concepts and related tasks.

Extensions (SSRS)

Extensions include security extensions, data processing extensions, rendering extensions, report processing extensions, and delivery extensions.

Tools (SSRS)

Describes the tools for report authoring, tools for report server administration, and tools for report content management.

Technical Reference (SSRS)

Cause and Resolution of Reporting Services Errors

Lists errors and resolutions related to Reporting Services.

Report Designer F1 Help

Describes the F1 Help for the SQL Server Reporting Services Report Designer wizards, views, and dialog boxes.

Report Manager F1 Help

Lists topics providing page-level help for SQL Server Reporting Services Report Manager.

Reporting Services Configuration Manager (SSRS)

Describes the tasks that can be performed with the Reporting Services Configuration Manager.

Describes the requirements.

Explains how to start the Reporting Services Configuration Manager.

Report Wizard Help

Describes a list of F1 help topics for the Report Wizard.

HTML Viewer and the Report Toolbar

Describes the components and functions of the report toolbar.

Explains parameters and credentials.

Device Information Settings for Rendering Extensions (Reporting Services)

Lists the device information settings that are used to pass rendering parameters to a rendering extension.

rs Utility (rs.exe) (SSRS)

Describes the syntax, file location, arguments, permissions, and examples.

rsconfig Utility (SSRS)

Describes the syntax, arguments, permissions, file location, remarks, and examples.

rskeymgmt Utility (SSRS)

Describes the syntax, arguments, permissions, examples, file locations, and remarks.

Reporting Services WMI Provider Library Reference (SSRS)

Describes the MSReportServer_Instance class and the MSReportServer_ConfigurationSetting class.

A3

Method 1:

There are two ways one can go about completing this part of the coursework. The first would be by making SQL queries. To do this, two commands are needed:

SELECT * FROM INFORMATION_SCHEMA.TABLES;

SELECT * FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = 'table_name';

The first query returns a list of the tables which are present in the Mondial database. We can see the list of tables on the following page.

Using the table names listed here, we can use the second query to return the full information schema for each table. An example is shown on the following page.

The full list of information displayed by the command is listed below:

TABLE_CATALOG

TABLE_SCHEMA

TABLE_NAME

COLUMN_NAME

ORDINAL_POSITION

COLUMN_DEFAULT

IS_NULLABLE

DATA_TYPE

CHARACTER_MAXIMUM_LENGTH

CHARACTER_OCTET_LENGTH

NUMERIC_PRECISION

NUMERIC_PRECISION_RADIX

NUMERIC_SCALE

DATETIME_PRECISION

CHARACTER_SET_CATALOG

CHARACTER_SET_SCHEMA

CHARACTER_SET_NAME

COLLATION_CATALOG

COLLATION_SCHEMA

COLLATION_NAME

DOMAIN_CATALOG

DOMAIN_SCHEMA

DOMAIN_NAME

While this method works, there is a much more straightforward way of completing this part of the coursework: getting the information we need directly from the GUI via the Object Explorer.

Method 2:

From the Object Explorer, we can very easily see all the tables that are in the Mondial database by simply navigating to Databases > Mondial > Tables, as shown in the screenshot below.

From the Object Explorer, we can also find out the attributes for each table. This can be done by navigating to the Keys folder for each table and clicking on the corresponding key.

This will display the window shown below. From this window, we can see the basic schema of the Borders table, as well as which column(s) form the primary key, which value(s) can accept null, as well as their data types.

While I am sure there might be situations where I would need to use SQL queries in the future, this Object Explorer provides a quick and easy way to look into the tables and the attributes within them. So here is the information listed for each table:

Borders: 305 rows
City: 3051 rows
Continent: 5 rows
Country: 195 rows
Desert: 30 rows
Economy: 195 rows
Encompasses: 198 rows
Ethnic_group: 520 rows
Geo_desert: 58 rows
Geo_island: 212 rows
Geo_lake: 174 rows
Geo_mountain: 100 rows
Geo_river: 448 rows
Geo_sea: 567 rows
Is_member: 7766 rows
Island: 162 rows
Lake: 90 rows
Language: 110 rows
Located: 213 rows
Merges_with: 34 rows
Mountain: 87 rows
Organization: 168 rows
Politics: 195 rows
Population: 195 rows
Province: 1382 rows
Religion: 406 rows
River: 132 rows
Sea: 22 rows

B1

According to the study guide, a database system is one which:

Retrieves data

Inserts new data

Deletes unneeded data

Updates old data

In addition, the study guide defines a database management system (DBMS) as software used for defining, accessing and maintaining a database.

Therefore, by the above definitions, we cannot consider an online textbook plus search engine as a database or DBMS. The text search function is only capable of information retrieval. There is no data insertion, deletion of data, or updating of old data. Also, in a database, all data is structured and categorized into proper rows and columns. On the other hand, the data in the online textbook is unstructured. Therefore, based on all of this, the textbook search function cannot be considered a database.

B2

Data entry clerk is an unskilled job.

Errors in computer data entry accounted for 13% of the 235,000 medication errors, which is 27,711 data entry errors.

A data entry error (the omission of a single letter in the commands issued to the probe) was the cause of the failure.

The more skilled and experienced operators might be more complacent and pay less attention to their work, as they tend to automate routine actions. Also, while users are used to handling typing mistakes, the delete functions may not be similar across different word processors, which could result in transfer errors.

71% to 98%

B3

10 tables x 500,000 rows x 8 columns x 8 characters or digits = 320,000,000 characters or digits entered.

Lowest error rate mentioned in the paper: 0.02%

320,000,000 x 0.02% = 64,000 errors

320,000,000 x 0.5% = 1,600,000 errors
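As a quick sanity check of the arithmetic above, the same figures can be reproduced with a few lines of Python (the 0.02% and 0.5% rates are the ones quoted above):

characters_entered = 10 * 500_000 * 8 * 8      # tables x rows x columns x characters or digits
print(characters_entered)                      # 320000000
print(round(characters_entered * 0.0002))      # 0.02% error rate -> 64000 errors
print(round(characters_entered * 0.005))       # 0.5% error rate  -> 1600000 errors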

B4

4657-4686-6987-0760

Credit Card Number:  4  6  5   7  4  6  8   6  6   9  8   7  0  7  6   0
Double every other:  8  6  10  7  8  6  16  6  12  9  16  7  0  7  12  0
Sum of Digits:       8  6  1   7  8  6  7   6  3   9  7   7  0  7  3   0

Total = 85

4658-4686-6987-0760

Credit Card Number:  4  6  5   8  4  6  8   6  6   9  8   7  0  7  6   0
Double every other:  8  6  10  8  8  6  16  6  12  9  16  7  0  7  12  0
Sum of Digits:       8  6  1   8  8  6  7   6  3   9  7   7  0  7  3   0

Total = 86

4659-4998-6988-0760

Credit Card Number:  4  6  5   9  4  9  9   8  6   9  8   8  0  7  6   0
Double every other:  8  6  10  9  8  9  18  8  12  9  16  8  0  7  12  0
Sum of Digits:       8  6  1   9  8  9  9   8  3   9  7   8  0  7  3   0

Total = 95
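The three totals above come from the Luhn check: double every second digit (for these 16-digit numbers that is the 1st, 3rd, 5th, ... digit from the left, as in the tables), add the digits of any two-digit result, and sum everything; a number passes only when the total is divisible by 10. A minimal Python sketch of that calculation, run against the three card numbers above, is shown below.

def luhn_total(card_number):
    # Keep only the digits (ignore the dashes).
    digits = [int(ch) for ch in card_number if ch.isdigit()]
    total = 0
    # Counting positions from the right, every second digit is doubled; for a
    # 16-digit number these are exactly the digits doubled in the tables above.
    for position_from_right, digit in enumerate(reversed(digits), start=1):
        if position_from_right % 2 == 0:
            doubled = digit * 2
            total += doubled - 9 if doubled > 9 else doubled
        else:
            total += digit
    return total

for card in ("4657-4686-6987-0760", "4658-4686-6987-0760", "4659-4998-6988-0760"):
    total = luhn_total(card)
    print(card, total, "valid" if total % 10 == 0 else "invalid")   # 85, 86, 95 -> all invalid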

B5

The Entity Integrity Constraint states that no primary key value can be null. Therefore, based on the schema for the tables listed in part A of this coursework, the Entity Integrity Constraint is enforced on the Continent table, as its primary key does not contain a null value.

The attribute integrity constraint is enforced on the Economy table, as it contains a CHECK constraint for the GDP attribute.

As there are no foreign keys present in the Mondial database, it is impossible to say whether referential integrity is enforced.
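For illustration, the sketch below (plain Python with the built-in sqlite3 module, using simplified hypothetical tables rather than the real Mondial schema) shows how the three kinds of constraint discussed above are declared and enforced:

import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")   # SQLite only enforces foreign keys when this is on

# Entity integrity: the primary key is declared NOT NULL, so no key value can be null.
con.execute("CREATE TABLE Continent (Name TEXT NOT NULL PRIMARY KEY, Area REAL)")

# Attribute integrity: the CHECK constraint restricts the values the GDP column may take.
# Referential integrity: the FOREIGN KEY ties each Economy row to an existing Continent row.
con.execute("""CREATE TABLE Economy (
                   Country TEXT NOT NULL PRIMARY KEY,
                   Continent TEXT REFERENCES Continent(Name),
                   GDP REAL CHECK (GDP >= 0))""")

con.execute("INSERT INTO Continent VALUES ('Asia', 4.5e7)")
con.execute("INSERT INTO Economy VALUES ('Malaysia', 'Asia', 364.7)")

# Each of the following would raise sqlite3.IntegrityError:
#   con.execute("INSERT INTO Continent VALUES (NULL, 1.0)")           # null primary key
#   con.execute("INSERT INTO Economy VALUES ('X', 'Asia', -1)")       # CHECK violated
#   con.execute("INSERT INTO Economy VALUES ('Y', 'Atlantis', 10)")   # no matching Continent row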

B6

B7

Groom ->> Horse

Horse -> Groom

Groom ->> Horse

Horse ->> Groom

Groom -> Horse

Horse -> Groom

Groom -> Horse

Horse ->> Groom

B8

Aphra and Betty are incorrect. The reason being, there will be database errors should a student fail a subject and have to retake an exam the following year. The database will not be able to distinguish the student numbers and subjects from one year to the next.

Gemma is right.

Bot is wrong. For example, should a student fail an exam this year and pass the exam the following year, the database might not be updated correctly.

B9

B10

MUMPS

http://thedailywtf.com/Articles/A_Case_of_the_MUMPS.aspx

MUMPS, also known as the Massachusetts General Hospital Utility Multi-Programming System, is a programming language developed in the late 1960s by Neil Pappalardo and his colleagues at Massachusetts General Hospital. According to its wiki article, MUMPS has a built-in database which supports a hierarchical structure made of sparse arrays. Currently, healthcare and finance are two sectors which still use MUMPS as part of their information systems. According to Papadimoulis (2007), similar to languages such as COBOL and FORTRAN, MUMPS is an ageing and obsolete programming language.

MUMPS supports only one universal datatype, meaning that data is coerced and parsed as a string, integer, or other datatype as and when the context requires. Commands and functions may be abbreviated, to as little as three characters in length. This aided multi-programming and resulted in better performance, as programs are much more compact. However, as MUMPS programmers generally used their own abbreviations without commenting the code, it is extremely difficult for other programmers to understand and maintain a program without analysing the code line by line.

NoSQL

Unlike what the name implies, NoSQL is usually interpreted as "not only SQL" rather than literally "no SQL". The term NoSQL was first used back in 1998 by Carlo Strozzi to describe his own derivative of the traditional relational database system, which does not use SQL. According to him, his NoSQL differs from the recent NoSQL movement in that the former is a well-defined software package and is relational, while the latter is a concept of a non-relational database system (which also follows the concept of not using SQL).

According to Finley (2012), the more recent NoSQL movement can be traced back to Google and Amazon. Both companies required new databases which were designed to run across a large number of servers in order to store and process 'big data'. Said 'big data' refers to a massive amount of data (in the scope of petabytes, exabytes, etc.) which traditional relational databases could not cope with. As a result, Google created BigTable, and Amazon created Dynamo, to deal with 'big data' and support their expanding online services. Other companies then sought to replicate the databases created by both companies after Google and Amazon published research papers detailing their respective databases. This led to the increase in popularity and implementation of NoSQL databases. Facebook is another company which utilizes NoSQL databases.

So, what are the characteristics of NoSQL that make the concept so popular? As mentioned, NoSQL databases are needed when there is a demand to manage 'big data' that the usual relational database solutions cannot cope with. Fowler and Sadalage (2012) reiterate that NoSQL does not adhere to the relational database model, and thus does not use SQL. NoSQL does not use any fixed schema, so any data can be stored in any record instead of in fixed columns and rows. Also, NoSQL databases are built to stretch across multiple servers instead of being constrained to a single server as in the case of a relational database. This is important for companies, as it is much more expensive to buy or upgrade to a bigger server, while it is much more cost-effective and efficient to scale horizontally by purchasing more machines instead (forming a large cluster called a 'cloud'). This implies that relational databases are much more expensive to maintain and upgrade than NoSQL databases. Furthermore, while the individual machines in the cloud may fail, the overall cluster remains reliable. Therefore, NoSQL servers are much more economical, scalable, and flexible compared to relational databases.
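As a small illustration of the schema-less idea described above, the sketch below uses plain Python dictionaries serialized as JSON to stand in for the records of a document-style NoSQL store (a real system such as MongoDB or DynamoDB would hold them in much the same shape); note that the two records deliberately carry different fields:

import json

# Two records in the same "collection"; neither has to follow a fixed schema,
# so each document simply carries whatever fields it needs.
customers = [
    {"name": "Aisha", "email": "aisha@example.com"},
    {"name": "Ben", "orders": [101, 102], "loyalty_points": 40},
]

# Document stores typically hold and exchange records as JSON documents
# rather than as rows in fixed columns.
print(json.dumps(customers, indent=2))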

Fowler and Sadalage (2012) further explain that such clusters can help reduce development drag. The reason being, in most modern relational databases, much time is spent mapping the various relations. Therefore, by implementing a non-relational database, as in the case of NoSQL, development time can in theory be greatly reduced. However, this only applies in cases where a non-relational database would be preferred, and a relational database is still preferred for many conventional situations. Fowler and Sadalage (2012) also mention that the large server clusters which support NoSQL databases are of an obviously much larger scale compared to relational databases, and are thus able to store a much larger amount of data (in the petabytes) and process and analyse that data much more efficiently.

However, as a relatively recent concept, NoSQL does have some drawbacks as well. Harrison (2010) summarizes the disadvantages of NoSQL into the following points: a lack of maturity, inferior support (as compared to relational databases), difficulty of administration, less effective business intelligence, and a general lack of expertise among developers.

In conclusion, while NoSQL is much better than relational databases at handling big data, relational databases are still relevant today due to the maturity of the platform and the great support and expertise available. Furthermore, not all companies need to handle big data, and as such the relational database is still the most viable and conventional solution. Companies that do need to deal with big data can certainly consider implementing NoSQL solutions to meet their business needs. However, as mentioned, the general lack of maturity, expertise, and support could be possible barriers for a company that is considering NoSQL solutions.

 

Development Planning and Management

This study explores land use planning in the Agona and Atuabo oil and gas enclave. The study assesses the land use planning procedures which are followed or considered in developing oil and gas discovery communities/regions, to understand the motivation for urban planning intervention in the emerging oil and gas regions, to examine how land use planning procedures and responses before and after the oil find in the region have changed or remained the same in response to the changing situation, and lastly to draw lessons for improvement in planning practice and management.

The case study research method was employed in this study, focusing on land use planning in the Agona and Atuabo oil and gas enclave. Three data collection techniques were employed: semi-structured interviews, field observations and secondary data analysis. Using a purposive sampling method, seven institutions whose activities are related to the land use planning process were selected to provide an understanding of the study, along with 368 household heads. The data were analysed using the AutoCAD software programme and by transcribing the interviews.

The land use planning processes in the oil region enclave have undoubtedly responded to the emerging situation by optimising land use following the discovery of oil and gas, providing for public infrastructure to meet the growing population density with the aim of protecting the environment; these are the motivations for land use planning in Agona and Atuabo. Other findings from the study also indicate the physical and socio-economic importance of land use planning, such as increases in land values, improved accessibility, etc.

The study recommends that the Ahanta West and Ellembelle District Assemblies should be encouraged to pursue district spatial development frameworks, structural plans and local plans, as well as updating outdated land use planning schemes within the statutory timeline. The study also recommended that land use planning should always precede physical development, and recommended the development of a strategic land use management system to sustain the achievements of land use planning in Agona and Atuabo.

My deepest gratitude goes to Allah for His abundant blessings, favour, guidance and direction during my studies.

I am very thankful to my supervisor, Dr. Patrick Cobbinah. He encouraged and supported me through the writing of my thesis, and provided substantial help in making this project successful with his support, direction and guidance.

I would also like to thank Mr. Benedict Arkhurst (LUSPA Head Office) for his tremendous support. My sincere gratitude also goes to the staff of the Department of Planning, Kwame Nkrumah University of Science and Technology (KNUST), not forgetting the staff of the Ellembelle and Ahanta West District Assemblies, particularly the Physical Planning Directors of both districts.

To my friends who supported me in this research, especially my course mates and colleagues at my workplace, I say may Allah bless you all.

Words alone cannot express my heartfelt appreciation to my family, especially my beloved wife AZIZA ABUBAKAR, my daughter and my mum, for their prayers. I could not have achieved this without your love, support, and patience over these years. Thank you and may Allah bless you.

Natural resources are extremely important in the world's economy and trade. The management of natural resources has been one of the herculean tasks that developing countries (e.g. Angola, Nigeria, Sudan, Republic of Congo) with such natural resources have had to contend with, due to, among other factors, inadequate planning (Sachs and Warner, 2001).

Obeng-Odoom (2009) maintains that the importance of oil and gas cannot be underestimated in accelerating the rate of urbanization both within and around the regions where oil resources are mined. Papyrakis and Gerlagh (2003) observe that oil-producing countries (e.g., Norway) have experienced an increase in income level and turned their oil resource into prosperity and wealth for the next generation. Mabe (2013) establishes that land values, particularly in the capital of Ghana's oil find region (Sekondi-Takoradi), rose by as much as 62.5% in 2008 and 2009 because of oil-induced migration. Obeng-Odoom (2009) suggests that the increasing rise in land prices is related to the increasing need for development in infrastructure and social services to meet the demands of the oil find area, which is characterized by increases in living and housing costs. Crawford (2010) argues that the oil industry offers employment opportunities to people who move into the region with the hope of finding a job.

Omajemite (2008) notes that oil industries in the Niger Delta have introduced pollution into the environment. Despite the job opportunities created by the oil and gas industry in oil find regions, Palley (2003) observes that one major challenge of the oil industry is the loss of the economic livelihoods of the people in the oil find region. The barring of fishing activities around oil and gas territories affects the livelihoods of these people. Palley (2003) also argues that the pollution of water sources, increased temperatures, dusty air, the taking away of arable lands and the loss of local economic ventures which serve as sources of livelihood for indigenes are problems commonly associated with oil and gas industries. Another challenge posed by the activities of the oil and gas sector is the creation of new settlements, which are primarily dormitory settlements. These settlements exhibit characteristics like bad roads, inadequate supply of social services, and poor environmental conditions, which are in direct contrast to planned settlements. Obeng-Odoom (2015) observes that the oil discovery increases road traffic in Sekondi-Takoradi because of the increase in both human and vehicular traffic, which increases travel time.

According to Obeng-Odoom (2014), the challenges of land use planning in rich natural resource regions include the increasing competition for space among human activities. Obeng-Odoom (2009) cites rapid urbanization, which progressively complicates and exacerbates the inter-related problems of emerging oil find areas, as another challenge posed by the oil and gas sector to land use planning. UN-Habitat (2009) opines that the lack of resources and technical capacities to manage or address the crises within emerging oil and gas regions is another challenge to land use planning. Adarkwa (2012) observes that an ineffective development control mechanism to check development is one major challenge to land use planning.

Land use planning is a general term used for a branch of planning which encompasses various disciplines that seek to order and regulate land use in an efficient and ethical way, thus preventing conflicts between various land uses. Lafferty and Frech (1978) observe that land use planning eliminates negative externalities among conflicting land uses. Randolph (2004) further emphasizes that land use planning protects natural environments and consequently promotes the location-specific distribution of public facilities. Muro and Puentes (2004) explain that the provision of an adequate amount of public goods and services is efficiently pursued under land use planning. Dawkins (2000) argues that land use planning reduces uncertainty and the transaction costs involved in the land development process. The trend of oil-induced urbanization in emerging oil and gas regions presents another urgent need for land use planning in oil find regions.

There is enough evidence in the literature to suggest that there are undesirable consequences of failing to seek land use planning intervention in emerging oil and gas regions. Given the significance of land use planning, this study focuses on land use planning in the midst of riches: a case of the Atuabo and Agona oil and gas region.

According to the Western Region Spatial Development Framework (2012), the Western and Western North regions cover about 10% of Ghana's land area, but contribute over 50% of its wealth. The two regions are endowed in terms of climate and natural resources, and have attracted high levels of investment over the past centuries. However, these investments have not always been beneficial to the population and environment, mainly due to inadequate planning (WRSDF, 2012). The dominant sectors of the regions' economy are mining, agriculture and, recently, oil and gas as well.

Joe (2013) explains that in June 2007, Kosmos Energy of the United States, in partnership with Anadarko Petroleum Corporation, Sabre Oil and Gas Ltd., EO Group, Tullow Ghana Limited and the Ghana National Petroleum Company (GNPC), announced the discovery of large deposits of crude oil to the Government of Ghana. The field was later named the Jubilee Field, and currently another field called Tweneboa Enyenra Ntomme (the TEN Project) and the newest discovery, known as the Sankofa and Gye Nyame Project, are all within the study environment. The Jubilee Field commenced commercial production in the fourth quarter of 2010 for crude oil and in the first quarter of 2015 for natural gas. Oil operations at the TEN, Sankofa and Gye Nyame fields are also ongoing along the Western and Central coastlines of Ghana.

 

Client Server Architecture And Multiple Site Processing Computer Science Essay

A variation of the multiple-site processing, single-site data approach is known as client-server architecture. Client-server architecture is similar to that of the network file server except that all database processing is done at the server site, thus reducing network traffic. Although both the network file server and client/server systems perform multiple-site processing, the latter's processing is distributed. Note that the network file server approach requires the database to be located at a single site. In contrast, the client/server architecture is capable of supporting data at multiple sites.

The client/server architecture features a requester of resources, or client, and a provider of resources, or server. Client-server architecture developed as a response to file-sharing architectures, which require a great deal of bandwidth and can stall or jam a network. In client-server architecture the file server is replaced by a database server. Any queries a user has are answered by the Database Management System (DBMS). Only the particular query that is asked gets answered, and only that result is transferred instead of the whole file, which would slow the network down.
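The sketch below is a minimal, hypothetical illustration of that idea in Python: the "server" holds a small table and evaluates the query itself, so only the matching rows travel over the connection, not the whole table.

import socket
import threading

# Toy table held by the server; the client never downloads it in full.
TABLE = [("Alice", "London"), ("Bob", "Paris"), ("Cara", "London")]

def serve_one_query(listener):
    conn, _ = listener.accept()
    with conn:
        city = conn.recv(1024).decode()                     # the client's "query"
        rows = [name for name, c in TABLE if c == city]     # processed at the server
        conn.sendall("\n".join(rows).encode())              # only the result travels

listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
threading.Thread(target=serve_one_query, args=(listener,), daemon=True).start()

with socket.create_connection(listener.getsockname()) as client:
    client.sendall(b"London")
    print(client.recv(1024).decode())    # -> Alice and Cara, not the whole table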

The main and primary language for structuring queries is SQL, which stands for Structured Query Language; RPC stands for Remote Procedure Call. SQL is usually used through a Graphical User Interface (GUI) to request data from the database. There are different versions of SQL available from the major vendors, such as Microsoft and Oracle. RPC (Remote Procedure Call) is a set of rules for requests that is used by one program to request data services from another program on another computer in another network.

There is no need to have full knowledge of the network details with RPC. RPC allows different types of applications, and those applications are accessible from different platforms. Client and server stubs are created respectively, so that each side has the section it needs for the remote functions it requests. The main purpose of a stub is to handle the communication between client and server when a remote function is required, keeping the application synchronized. With the help of RPC, client-server software is easier to design, and it can employ multiple programs distributed over a network.

A client is a program or computer that requests and receives services from another program or computer. A device that is not capable of running a stand-alone program of its own can still do so through a remote computer via a network. Client-server systems also use the internet, where users connect to services running on a remote system through internet protocols. A web browser is a client that connects to web servers and retrieves web pages for display. There are different types of clients for different purposes. Most users use an e-mail client to receive their e-mail from an internet service provider's mail storage servers. Another type of client is online chat; depending on the chat protocol, most users use online chat with their friends and family. Users can also play online multiplayer games on computers, where the programs used are called game clients.

Day by day, increasingly large numbers of client applications are connected to web sites, making the browser a kind of universal client. This avoids having to download a large amount of software onto the user's computer for each application, for example mail services. In workstation and personal computers, the difference between client and server operating systems is mostly a matter of workload: servers contain more operating system components, allow immediate login, and are more expensive than clients, while the client version may incorporate more end-user software. There are three different types of clients, shown below.

Thin Client.

Fat Client.

Hybrid Client.

The thin client is the smallest type of client. Thin clients depend on a central computer. The basic job of a thin client is to display the graphical image provided by an application server. Thin clients include programming environments like Java, ASP, Ajax, and PHP.

Fat clients perform a large amount of data processing themselves, without the help of a server. "Rich client" and "thick client" are two other names for the fat client. Fat clients mainly take the form of personal computers and laptops that can operate independently.

The hybrid client is a mixture of the two types of client above, fat and thin. The hybrid client is related to the fat client, but a hybrid client also makes use of a server.

The server is a combination of hardware and software that provides services to clients.

Early computer system architectures developed solely around the capabilities of the hardware in use at the time of the application. The fastest and simplest of all is the mainframe architecture, in which all processing of functions and operations happens within a central computer. With the help of dumb terminals, users interact with the central computer: the terminal captures keystrokes and sends them to the central computer, and displays the results of the instructions to the user. Such applications rely on the relatively large computing power of the mainframe central computer.

 

Fantasy’s Killer

Throughout history, seldom has an individual been able to hold a city in fear. Most times people will just either ignore the individual, let the police handle the situation, or call them wacko or crazy. But then there are the extreme cases. On this end of the scale people may have extreme mental problems or very strong motives, so extreme or so strong that they captivate an entire city or even nation.

Jack the Ripper did it by killing and murdering five prostitutes. In recent times it has been people like Charles Manson and the Boston Strangler, who assaulted and murdered thirteen women, keeping the entire city captivated in a state of fear. Serial killers…

Fantasy plays a very important role in creating and/or molding a serial killer, and it is shown through remorse, uniform and whether or not they have or follow a trademarked style.

All of these elements combined expose the fantasy portrayed by a serial killer. Pain: does a serial killer have or feel pain for their victims? Sometimes the killer may, depending on the psychological state they are in.

But then, given the fact that most serial killers grow up in violent households, one might say that the killer does not or even can't feel remorse, because violence is what they were brought up in and taught to believe is true, so it is what they have been conditioned to believe.

Or you could take the approach that the killer feels threatened by their victims, so the victims had to be removed from the picture altogether. Or, finally, it might be that the killer is so mentally unstable that they have little to no conscious awareness of what they are doing.

But what if they do feel sorrow? Can a serial killer feel sorrow? One researcher says no: “The inner workings of the mind of a serial killer cannot grasp the feeling of any kind of sorrow or remorse.” A serial killer, therefore, is hate manifested into a physical state and may or may not have motives to kill. Serial killers may kill for many reasons, whether for revenge or something else, but one thing remains a standard among killers: you have to be smart to stay alive. With that intellect comes avoidance of the law and of capture.

One of the best cases of an extremely intelligent killer was the Zodiac Killer in California. His uniform was to kill couples parked in cars by shooting through the window with a .44 gun. Then he changed his uniform victim to single women who were walking. Then once again he changed uniform, this time back to couples, but only if they were not in a car. Thus he had a particular type of victim; although he changed it, it still followed suit. He was just smart enough not to become a pattern killer, and changing his uniform meant the police would have little chance of catching him.

Many killers have a uniform, a favorite way of killing if you like, a sort of pattern. This pattern mostly applies to the victim, though. “Serial killers tend to go after women and children. However some homosexual killers enjoy hunting gay men.” The motivation of these killing sprees is to some extent unknown, but it is speculated that “Their methodical rampages are almost always sexually motivated.” Yet there are some killers who kill at random, with little to no motivation and no uniform victim.

These killers travel around and kill whomever and whenever they feel it necessary: “This human predator claims more than three victims with a cooling off period in between each killing.” Some people attribute a killer not following a uniform as the reason their killing rampage can continue for so long, yet there are also killers who follow a uniform and still get away with a lot, for instance Pedro Alonso Lopez, the most prolific and successful killer of all time. Pedro killed an estimated 300 victims on the low end and 450 on the high end, all of them teenage girls. Pedro’s killing spree took him across three nations. He would have gone free, except that a flash flood unleashed the bodies from their graves. Pedro, like so many others, had his own way of killing, his own trademark style. In this part of the essay certain serial killers will be introduced and a description of their styles will be given. Warning: this part contains graphic material.

First of all there is the Green River Killer, who, “using a stolen ambulance and police car would lure his victims inside and then brutally beat them and rape them”; he would then leave their corpses beside a river. He is believed to have killed 48 or more victims. Next is Gilles de Rais. An ally of Joan of Arc in fifteenth-century France, Gilles started out as a war hero but then turned to torture and murder. “He enjoyed killing mostly young boys, who he would sodomize before and after the decapitation. When he wasn’t feeling up to the task he enjoyed watching his servants butcher the boys and masturbated over their entrails.”

Then comes “Andrei Chikatilo: the Soviet Hannibal Lecter.” He lived in Rostov, Russia, about 500 miles from Moscow, where he led a quiet married life as a teacher. What the town didn’t know was that Andrei preyed on small children. He stalked many of his victims in train and bus stations and had a penchant for disembowelment and mutilation. He was also a cannibal and a sadist. And last is “John Wayne Gacy.” John liked to dress in a handmade Pogo the Clown outfit to entertain children. This lonely and sadistic contractor also liked to entertain young boys privately in a very different fashion.

The prototypical organized killer, he had all the aspects of the murder worked out before each kill. Once he entered his murderous fantasy there was no turning back. “He enjoyed handcuffing his victims, anally raping them, beating them to a pulp, then offering them a peanut butter and jelly sandwich; finally he would recite passages from the Bible and strangle them to death.” This brings to mind the saying “to each his own,” showing that each killer’s mind works in its own unique way, taking its own pleasure.

“Serial killers are similar enough to put into one category, but different enough to make them difficult to study.” Although we may look inside the mind of a serial killer and, in a medical sense, understand how it works, we cannot yet grasp the mental concept behind why a serial killer kills. It cannot be denied that a serial killer kills; however, we can try to understand how to subdue these mad rampages displayed by another human being. In that sense we don’t really understand why anyone kills, for “A mere slip of the hand on the steering wheel can turn a normal person into a killer.”

So until we can fully understand the motives and workings of an individual’s mind, we cannot stop them from thinking the thoughts they think or conjuring the ideas they want. Until then the world has to move on.

 

Databases role in the modern world

Over time, humankind has struggled to adjust to and cope with modern-day challenges. In order to survive, a person must be capable of generating unique and innovative ideas, understanding previously recorded data, formulating rational decisions, and creating significant changes in the pool of knowledge. To do these things, a person must be able to manage a vast collection of data and sort out the resources readily available.

Day by day, we keep searching through different types of information to understand the realities of life, so that we can answer problems and make our lives easier. This practice of saving and accessing data in many forms started in ancient times and has continued to this day. We have evolved from using stone and paper up to now, in the modern world, where we use computers with the aid of databases.

A database is a collection of information that is organized so that it can be easily accessed, managed and updated.

In addition, according to Robert J. Robbins, a database is a persistent, logically coherent collection of inherently meaningful data, relevant to some aspects of the real world.

Moreover, a database (DB), in the most general sense, is an organized collection of data. More specifically, a database is an electronic system that allows data to be easily accessed, manipulated and updated. In other words, a database is used by an organization as a method of storing, managing and retrieving information.

Modern databases are managed using a database management system (DBMS) (Technopedia.com).

Technopedia explains Database (DB). Software programmers are well acquainted with database concepts through relational databases like Oracle, SQL SERVER and MySQL, etc. Typically, a database structure stores data in a tabular format.

Moreover, Database architecture may be external, internal or conceptual. The external level specifies the way in which every end-user type comprehends the organization of its corresponding relevant data in the database. The internal level deals with the performance, scalability, cost and other operational matters. The conceptual level perfectly unifies the different external views into a defined and wholly global view. It consists of every end-user required generic data.

Databases are used to support internal operations of organizations and to underpin online interactions with customers and suppliers.

Databases are used to hold administrative information and more specialized data, such as engineering data or economic models. Examples include computerized library systems, flight reservation systems, computerized parts inventory systems, and many content management systems that store websites as collections of webpages in a database.

Databases can make your organization much more efficient and give management valuable insights. They help make sense of your information. They can help you make your products and services more valuable. They can help you sell more.

For example, if you own an online store, you could use a database for your website to keep track of customer data, purchases, prices, and other information. This can be transferred directly into your accounting system saving you the time to collect the data, find the corresponding spreadsheet, and input the data yourself.

For example, Jenny would like to put up a coffee shop along a busy street in the city. A database could be used as an inventory check of available materials, ingredients and resources, showing which are still in stock and which are out of stock. She could also use it to determine the best-selling or most frequently ordered type of coffee and use this data to come up with a promotion or strategy to increase the shop's sales.
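As an illustration of this coffee-shop example, the following sketch uses Python's built-in sqlite3 module; the file, table and column names are hypothetical.

```python
# Minimal sketch: track orders in SQLite and ask which coffee sells best.
import sqlite3

conn = sqlite3.connect("coffee_shop.db")
cur = conn.cursor()

cur.execute("""CREATE TABLE IF NOT EXISTS orders (
                   id INTEGER PRIMARY KEY,
                   coffee_type TEXT,
                   quantity INTEGER,
                   ordered_at TEXT)""")

cur.execute("INSERT INTO orders (coffee_type, quantity, ordered_at) VALUES (?, ?, ?)",
            ("latte", 2, "2019-06-01"))
conn.commit()

# Which coffee sells best? The answer could drive a promotion or sales strategy.
cur.execute("""SELECT coffee_type, SUM(quantity) AS total_sold
               FROM orders
               GROUP BY coffee_type
               ORDER BY total_sold DESC""")
print(cur.fetchall())
conn.close()
```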

Also, DB provides the following functions: Concurrency: concurrent access (meaning ‘at the same time’) to the same database by multiple users; Security: security rules to determine access rights of users; Backup and recovery: processes to back-up the data regularly and recover data if a problem occurs; Integrity: database structure and rules improve the integrity of the data; and Data descriptions: a data dictionary provides a description of the data.

In addition, the advantages of a DB are: reduced data redundancy; reduced updating errors and increased consistency; greater data integrity and independence from application programs; improved data access for users through the use of host and query languages; improved data security; reduced data entry, storage, and retrieval costs; and easier development of new application programs.

The disadvantages of a DB are: database systems are complex, difficult, and time-consuming to design; there are substantial hardware and software start-up costs; damage to the database affects virtually all application programs; there are extensive conversion costs in moving from a file-based system to a database system; and initial training is required for all programmers and users.

According to Chirantan Basu in his article Disadvantages of databases, Business databases may reside on desktop hard drives, corporate servers or remote servers. Small businesses can use external business databases for industry and competitive information and internal databases for storing sales and other data. Although databases allow businesses to store and access data efficiently, they also have certain disadvantages.

The first disadvantage is Complexity. Databases are complex hardware and software systems. In addition to the storage media, databases have software modules for accessing, manipulating and displaying the data. Database design and development is a complex undertaking, requiring experienced designers and significant financial resources. Therefore, small and large businesses typically acquire commercial databases and customize them to fit corporate requirements. Businesses should consider scalable database architectures to accommodate future growth.

The second is Cost. Databases require significant upfront and ongoing financial resources. Developing or customizing database management systems may involve frequent changes to system requirements, which lead to schedule slippages and cost overruns. A small business can reduce database management costs by taking advantage of cloud computing technology, which involves storing data on next-generation databases hosted by large infrastructure companies. Cloud computing reduces the need for large capital investments in information technology, thus allowing small businesses to allocate limited financial resources to other operational requirements.

The third issue is on Security. Companies need to know that their database systems can securely store data, including sensitive employee and customer information. Media reports of computer viruses and spam are common, as are reports of systematic hacking of corporate networks. Cloud computing can increase the security risks because the data are stored at third-party data centers, which can be located in foreign jurisdictions subject to different privacy and data integrity laws.

The last disadvantage is on Compatibility. There is a risk that a database management system might not be compatible with a company’s operational requirements. For example, if a small business wants to store detailed customer information or email traffic, it may find that its existing database cannot support the additional data fields or the new queries and reports. This is why scalable applications and cloud-based systems might be a more cost-effective solution for a small business because it can add functionality and resources as its needs evolve. Some database vendors offer scalability in the form of plug-in modules. For example, a company can first implement a database management system for its accounting operations and then add in the other operations later.

One of the major challenges for IT companies today is how to manage large, growing volumes of data and how to produce a quality-driven software product, ensuring optimal utilization of resources with minimum cost. A database management system is a software system, i.e. a set of programs, that provides its users with processes for defining, building, manipulating and sharing databases among users and applications. A cloud database management system is a database management system for managing cloud data and provides delivery of computing as services rather than as a product. In this paper the authors propose an architecture for the management of data in the cloud, termed "Cloud Database Management System Architecture". The cloud database management system provides an approach for the management of cloud data. Cloud data are spread over the internet and are stored on a remote server managed by a third party. Hence, cloud data management is a major issue that needs to be catered to. A well-defined architecture is thus required to manage cloud data available at a remote location. In this work an architectural model for a cloud database management system has been developed. This architecture is based on the three-schema architecture for database management systems and the three-level object-oriented database management system architecture. (Cloud Database Management System Architecture. Available from: /270791476_Cloud_Database_Management_System_Architecture [accessed Dec 17 2018])

Further, a database is an organized collection of data and is the heart and soul of any information system. Cloud infrastructure consists of huge volumes of data which might be shared among multiple tenants. Thus, data management in particular is an essential aspect of storage in the cloud. The data are distributed in the cloud across multiple locations and might contain privileged and authentic information. Therefore it is very important to ensure that data consistency, scalability and security are maintained. In order to address these issues and several other critical issues regarding data, a database management system for cloud data is imperative. In the cloud, two primary DBMS architectures are used: shared nothing and shared disk. Shared nothing is a distributed computing architecture in which each node is self-sufficient and independent of any other node, i.e. each node in a shared-nothing architecture has its own memory and disk storage and does not share it with any other node. There is no point of contention between the nodes. Shared disk is a computing architecture in which each node has its own memory but the nodes share disk storage space. It partitions the data such that each database server processes and maintains its own piece of the database. In that paper the authors first discuss shared-nothing and shared-disk architectures and their limitations, and then propose their cloud database management system architecture.
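A tiny illustrative sketch of the shared-nothing idea (not taken from the cited paper): each record is routed to exactly one self-sufficient node, here by hashing a key.

```python
# Illustrative shared-nothing routing: each node owns its own slice of the data.
from hashlib import sha1

NODES = ["node-a", "node-b", "node-c"]   # hypothetical self-sufficient nodes

def owning_node(key: str) -> str:
    """Route a record to exactly one node; no memory or disk is shared between nodes."""
    digest = int(sha1(key.encode()).hexdigest(), 16)
    return NODES[digest % len(NODES)]

print(owning_node("customer:42"))  # every lookup for this key goes to the same node
```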

As they say, anything in this world has its own positive and negative impacts, depending on how it is used or applied for the betterment of human activity. Despite the above-mentioned disadvantages of using databases in the modern world, they will never outweigh the advantages databases can bring to organizations, businesses, industries and other fields. The weaknesses of existing DBs must be viewed as mere challenges to improve their quality. We should not stop pursuing the possibilities within our reach in order to handle the ever-growing volume of data.

Overall, databases are very important in processing the data that we have, especially now that we have so much to generate, store and retrieve. We must be very careful with the data that we use and store. Security is one of the priority concerns that we must keep in mind. We must increase our ability to deal with the data we have and ensure that it is well managed and useful to everyone, even though we face many constraints.

 

Forecasting electricity consumption demand with weather

The major element of electricity resource planning is forecasting the upcoming electricity consumption. Precise forecast of electricity consumption is of primary importance in developing countries’ energy planning. In the last decade, several new techniques have been used for energy consumption planning to accurately forecast future electricity consumption requirements. Consumers’ living standards may vary depending on their weather sensitivity. Therefore, electricity demand is influenced by weather changes. This paper reviews the study on weather impacts on electricity demand in Sri Lanka.

The study involves developing several regression model designs and selecting the model capable of achieving the most accurate results. The most reliable model is then used to forecast electricity consumption and to build personalized consumer profiles. The model developed produces very satisfying outcomes, and the electricity consumption range can be predicted effectively. In this study, we propose a web application with two separate user views, admin and consumer.

The web application allows electricity providers and electricity traders to visualize the variation in electricity demand, and allows electricity consumers to view personalized consumer profiles.

Keywords: electricity consumption; forecasting; regression; random forest regression; decision tree regression; gradient boosting regression

Electricity is a major need that is produced and consumed simultaneously. For the past century, the need for and significance of forecasting consumer consumption trends on the market have become a much-debated subject. The demand for electricity has continuously increased in Sri Lanka. According to a survey done by the Public Utilities Commission of Sri Lanka in 2018, the total electricity consumption was 13.2 billion units in 2017 [1]. In countries like Sri Lanka that depend on various weather factors, living standards and other weather-dependent electricity consumption change accordingly. Because energy demand is heavily influenced by fluctuations in weather, demand patterns are likely to be affected. Forecasting electricity consumption along with weather factors is, therefore, a significant element in the strategic planning of electricity suppliers. Not only does the forecast provide the expected amount of electricity consumption needed, but it also helps manage reserved electricity for emergency use.

Consumers play an active part by dynamically altering their consumption and behavior, planning their home appliance usage around weather conditions. Therefore, it is important to improve consumer awareness of efficient electric power consumption. Historical consumption behavior will therefore become outdated, and the electricity provider will need fresh, sophisticated techniques for forecasting consumption and calculating dynamic load profiles.

Various methods and strategies have been developed for the prediction of electricity consumption [2]-[6]. Most of the existing models are based on artificial neural networks and regression analysis. Most analyses, however, consider only a few commonly available variables.

Regression analysis is a set of statistical methods used to assess the relationship between variables. Multiple techniques are involved in modeling and analyzing the variables, concentrating on the association between a dependent variable and one or more independent variables.

The research focuses on forecasting Sri Lankan power consumption through the use of multiple regression models based on consumption and weather information. Each model's performance was evaluated using several measurements, such as root mean square error and the R-squared value, and the best model was selected.

The demand for electricity is very high, so supply and demand must be managed appropriately by electricity providers. Forecasting upcoming demand and supply is important for maintaining this balance. Because incorrect estimation of electricity demand may cause many problems, the value of accurate forecasting is increasing. The study focuses on providing a more accurate predictive model to forecast upcoming electricity demand.

Prior to the research, a literature survey was conducted on current platforms with roughly the same capacity and features. Many authorities have recently introduced energy efficiency programs and policies.

The paper Artificial neural networks for daily electricity demand prediction of Sri Lanka [2] discusses a study on predicting the next-day electricity demand of Sri Lanka. The analyses were implemented based on an Artificial Neural Network (ANN) and multiple regression.

In Evaluation and Forecasting of Long Term Electricity Consumption Demand for Malaysia by Statistical Analysis [3], the study discusses an approach to understanding the factors affecting electricity demand and to forecasting electricity consumption for Malaysia.

The paper, Forecasting electricity consumption: A comparison of regression analysis, neural networks, and least squares support vector machines [6] focuses on comparing several predictive models and selecting the optimal model for forecasting electricity consumption in Turkey.

Ceylon Electricity Board (CEB) annual publications [4] present the annual statistical analysis results for both electricity consumption and generation data.

Most of the existing studies are based on historical consumption data only. Therefore, the proposed study forecasts electricity demand together with weather factors to provide a more accurate solution. In addition, there is no proper mechanism for increasing consumer awareness of efficient electricity consumption.

Based on the literature review, several requirements were identified: forecast electricity demand with weather factors, introduce personalized consumer profiles, and provide attractive visualization of the analysis results.

The process of the study is divided into several main stages: data collection, data preprocessing, model training, model evaluation and visualization.

Data for the monthly electricity consumption were obtained from the Ceylon Electricity Board for the period 2015.01.01 to 2019.01.01. The consumption data consist of Time, Bill Cycle, Area, Account Number, Days, Average Consumption, Charge, Outstanding and SIN Number as independent variables, and Consumption (units) as the dependent variable. Precipitation amount (millimeters), air temperature (°C), pressure (millimeters of mercury) and relative humidity (%) were used as the weather factors for the consumption analyses.

Consumption data and weather data are analyzed together; hence, data merging work was done on the two sources. The collected data were cleaned and preprocessed using several Python libraries, as sketched below.
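A hedged sketch of this preprocessing step, assuming pandas; the file names and column names are placeholders, since the paper does not publish its code.

```python
# Sketch: load, merge and clean the consumption and weather data with pandas.
import pandas as pd

consumption = pd.read_csv("ceb_consumption_2015_2019.csv", parse_dates=["Time"])
weather = pd.read_csv("weather_2015_2019.csv", parse_dates=["Time"])

# Merge the two sources on time (and on area as well, if both files carry it).
data = consumption.merge(weather, on="Time", how="inner")

# Basic cleaning: drop duplicates and rows with missing values.
data = data.drop_duplicates().dropna()
```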

The variables affecting electricity consumption patterns were identified through literature study and descriptive analysis (identifying factors linked to electricity consumption behavior and checking for any connection between consumption and weather changes using a sample data collection). Regression analyses with multiple variables were then performed on the processed data to investigate electricity consumption.

Decision Tree Regression learns object features and builds a regression model in a tree structure to predict data.

Gradient Boosting Regression is an ensemble regression model which converts weak learners into strong learners. GBR fits each new weak learner using the gradient of the loss function.

The Multilinear Regression Model is used to measure the relationship between two or more independent variables and a dependent variable.

The first-order linear model is given below:

y = β0 + β1x + ε

β0 and β1 – model parameters

ε – error term

Random Forest Regression is an ensemble model which uses bagging and multiple decision trees to perform regression and classification.
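A minimal sketch of how the four models named above could be trained with scikit-learn; the feature and target column names are assumptions, and `data` refers to the merged DataFrame from the preprocessing sketch above.

```python
# Sketch: fit the four regression models on the merged consumption/weather data.
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.model_selection import train_test_split

features = ["Precipitation", "Temperature", "Pressure", "Humidity"]  # assumed names
X_train, X_test, y_train, y_test = train_test_split(
    data[features], data["Consumption"], test_size=0.2, random_state=42)

models = {
    "multilinear": LinearRegression(),
    "decision_tree": DecisionTreeRegressor(random_state=42),
    "gradient_boosting": GradientBoostingRegressor(random_state=42),
    "random_forest": RandomForestRegressor(random_state=42),
}
for name, model in models.items():
    model.fit(X_train, y_train)
```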

Model evaluation was done based on several measures: root mean squared error (RMSE), mean absolute error (MAE) and the R² value.

RMSE is the standard deviation of the residuals (prediction errors); it measures how concentrated the data are around the best-fitted line.

MAE is the average of the absolute errors (the absolute differences between the true and predicted values).

R² is a statistical measure which defines how close the data are to the fitted regression line.
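These three measures can be computed, for example, with scikit-learn; this sketch assumes the fitted models and test split from the training sketch above.

```python
# Sketch: compare the fitted models on RMSE, MAE and R².
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

for name, model in models.items():
    pred = model.predict(X_test)
    rmse = np.sqrt(mean_squared_error(y_test, pred))
    mae = mean_absolute_error(y_test, pred)
    r2 = r2_score(y_test, pred)
    print(f"{name}: RMSE={rmse:.2f}  MAE={mae:.2f}  R2={r2:.3f}")
```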

Model validation was performed on multiple linear regression, which was identified as the optimal model for the data set. The assumptions of the built model were checked [7]:

The relationship between the independent and dependent variables should be linear.

The desired outcome is that points are symmetrically distributed around a diagonal line in the observed vs. predicted values plot or around a horizontal line in the residuals vs. predicted values plot.

Figure 3: Observed vs. Predicted Values and Residuals vs. Predicted Values

The mean of the residuals should be zero, or as close to zero as possible.

For the Breusch-Pagan test, the null hypothesis assumes homoscedasticity. The obtained results are shown below.
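The paper's numerical test results are not reproduced in this text; as a sketch, a Breusch-Pagan test can be run with statsmodels roughly as follows, reusing the training data from the sketches above.

```python
# Sketch: Breusch-Pagan test for heteroscedasticity of the OLS residuals.
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

X_const = sm.add_constant(X_train)          # add an intercept column
ols_fit = sm.OLS(y_train, X_const).fit()    # ordinary least squares fit

# Null hypothesis: homoscedasticity (constant residual variance).
lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(ols_fit.resid, X_const)
print("Breusch-Pagan p-value:", lm_pvalue)
```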

The web application was implemented using the Python-based web framework Django. The web application presents the descriptive and predictive analysis results in a more attractive way. It contains a customer view, which provides personalized consumer data, and an admin view, which provides consumption data.
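The authors' Django code is not included in the text; the following is a hedged sketch of what a personalized consumer view might look like, with hypothetical view names, URL pattern and returned fields.

```python
# Sketch (not the authors' code): a condensed Django view for the customer side.
from django.http import JsonResponse
from django.urls import path

def consumer_profile(request, account_number):
    """Return a personalized consumption summary for one account as JSON."""
    profile = {
        "account": account_number,
        "predicted_units": 132.5,   # placeholder: would come from the fitted model
    }
    return JsonResponse(profile)

urlpatterns = [
    # e.g. /consumer/1234567/ shows that consumer's personalized profile
    path("consumer/<str:account_number>/", consumer_profile),
]
```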

Figure 4: Customer View

Figure 5: Admin View

The study mainly focuses on building an accurate predictive model for forecasting electricity consumption with weather data, and on implementing a web application to visualize the descriptive and predictive model results. The optimal forecasting model was selected by comparing the RMSE, MAE and R² measurements. The selected model satisfied all the model assumptions.

Further work can be done to improve the accuracy of the forecasting model with more historical data and more weather factors.

First and foremost, the author would like to express gratitude to Dr. Windhya Rankothge for her guidance, support and direction. The author would also like to convey her appreciation to the Ceylon Electricity Board for providing the data, and to all the individuals who contributed, directly or indirectly, to the successful completion of this research document.

 

Apple Company Annual Report Analysis

The course project began with the selection of a publicly traded company. Apple Inc. was chosen for this portion of the project. Apple Inc.’s business operations and the market it is a part of had to be evaluated. Apple’s annual report for 2018 was used to evaluate economic value added (EVA), which is the excess of net operating profit after taxes (NOPAT) over capital costs (Brigham, 81). The annual report was also used to evaluate Apple’s free cash flow (FCF), which is defined as “the amount of cash that could be withdrawn without harming a firm’s ability to operate and to produce future cash flows” (Brigham, 77). “The return on total assets (ROA) which is determined by the ratio of net income to total assets and the return on common equity (ROE) which is determined by the ratio of net income to common equity” (Brigham, 114) also had to be analyzed.
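The ratio and EVA definitions cited above can be expressed directly; the sketch below uses placeholder numbers, not figures taken from Apple's 2018 report.

```python
# Sketch of the ROA, ROE and EVA definitions (per Brigham); inputs are placeholders.
def roa(net_income, total_assets):
    """Return on total assets = net income / total assets."""
    return net_income / total_assets

def roe(net_income, common_equity):
    """Return on common equity = net income / common equity."""
    return net_income / common_equity

def eva(nopat, total_capital, cost_of_capital):
    """Economic value added = NOPAT minus the dollar cost of the capital employed."""
    return nopat - total_capital * cost_of_capital

# Illustrative (made-up) inputs, in billions of dollars:
print(roa(net_income=50.0, total_assets=400.0))                      # 0.125 -> 12.5%
print(eva(nopat=60.0, total_capital=300.0, cost_of_capital=0.08))    # 36.0
```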

Week two’s portion of the course project was designed to calculate the peak car payment and mortgage payment that could be afforded in the given scenario.

The task also required that an amortization table be created to show the relationship between principal payments and interest payments and how they affected the total balance owed on the vehicle being financed. The amount of interest paid over the life of the loan was calculated. The summary stated that the buyer could afford a more expensive vehicle if they had a higher down payment or made more money per month. They could also purchase a more expensive vehicle with the same down payment and the same monthly payment if they extended the financing term from four to six years.

In week three’s portion of the course project, the purchase of stock via online brokerage accounts and the use of dividend reinvestment plans or mutual or index funds were evaluated (SU, 2019). An analysis was completed of online trading sites and their requirements for trading stock; the details are discussed below.

Week four’s portion of the course project analyzed the yields and maturity rates for government securities. Government securities are promissory notes or investment bonds that are sold and secured by the government. “Government securities come with a promise of the full repayment of invested principal at maturity of the security.” (Chen, June 2019). The three government securities discussed for this assignment are US Treasuries, municipal bonds, and corporate bonds. These types of investments are similar in some ways and different in others. People who are low-risk investors tend to be more comfortable investing in government securities because they are backed by the government. Having the government back the investments leads investors to believe that they will actually make some money without taking the risk of losing their initial investment. We had to justify which type of security we would hold as individuals relative to the interest rate risk we were comfortable taking. It can be noted that in all three types of securities the values have continually decreased since 2018.

Week five’s portion of the course project is a summarization of the course itself. It is an overview of what we have learned through the readings, discussions, and projects throughout the course. An additional explanation of how the amortization schedule could play a role in an organization is included in the final project. It is a compilation of the coursework that we have completed over the last five weeks.

Amortization Chart

The amortization chart that was created in week two was a representation of how interest affects the overall balance of a funded purchase or loan. I personally work in car sales. For my organization, I actually use amortization charts quite often. They are extremely useful when explaining to customers how a larger down payment can save them a lot of money over time. It also allows me to show them how much they can expect to pay in interest if they pay the loan as agreed. The chart is also great for showing customers how much money they can save over the term of the loan if they simply pay an extra $20-$25 a month on their payment. People are amazed and sometimes even angered when they see how much they will be paying in interest for a car loan. For example, if I have a customer come in that wants to purchase a $26,000.00 car with a $2000.00 down payment, an 11% interest rate, and financing for 60 months, their payments will be $561.39 a month. However, on the same vehicle, if they pay $5000.00 down and the rest of the parameters stay the same, their payment will drop to $496.16 per month. That is a savings of $3913.80 over the life of the loan. This also helps customers decide if paying the extra up front is worth the sacrifice in the long run.
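The payments quoted above follow the standard fixed-rate annuity formula; the sketch below implements it with the amount financed left as a free parameter, since the exact amount financed in the example (taxes, fees, and so on) is not spelled out in the text.

```python
# Sketch: monthly payment and amortization schedule for a fixed-rate loan.
def monthly_payment(amount_financed, annual_rate, months):
    """Standard annuity formula with monthly compounding."""
    r = annual_rate / 12                      # periodic (monthly) rate
    return amount_financed * r / (1 - (1 + r) ** -months)

def amortization_schedule(amount_financed, annual_rate, months):
    """Yield (month, interest, principal, remaining balance) rows."""
    payment = monthly_payment(amount_financed, annual_rate, months)
    balance = amount_financed
    for month in range(1, months + 1):
        interest = balance * annual_rate / 12
        principal = payment - interest
        balance -= principal
        yield month, interest, principal, max(balance, 0.0)

# Illustrative call: $24,000 financed at 11% APR over 60 months (about $521.8/month).
print(round(monthly_payment(24000, 0.11, 60), 2))
```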

There are many online sites where an individual can purchase and sell stocks. Each company has their own set of requirements for trading stocks through them. There are some online options that are said to be great for beginners in the investing market who have limited money to invest. There are others that are more appropriate for seasoned investors who have more money to invest.

Merrill Edge is an online investment company that falls under its parent company Bank of America. Merrill Edge charges $6.95 per trade. They do not have a minimum investment requirement. Merrill Edge is considered the best site for current Bank of America customers and for customers who are considered high-balance customers. They offer a wide array of stocks, bonds, options, and ETF trades. The commission fees can be waived depending on the number of monthly trades and the tier the investor is part of. Commission fees are charged for any trades below the required minimums. Other fees and costs may be incurred. (BAC, 2019).

TD Ameritrade is another online trading platform that has gained popularity in recent years. “TD Ameritrade is a good fit for new investors because it offers options for beginners, long-term investors, and active traders,” (TD Ameritrade, 2019). They have been in business more than 40 years. They have the option for new investors to make commission-free trades for 60 days. Otherwise their cost per trade is $6.95, the same as Merrill Edge. They have options for investors to purchase stocks, bonds, CDs, options, mutual funds, and over 300 ETFs that are commission free.

InteractiveBrokers is an online investing platform that is a little different than most. They do not have a set fee per trade; on stock options they charge USD 0.000119 quantity stock sold with a maximum of $5.95 per trade. They also charge a 1% fee on deposits that are required to be no less than $50.00. They also charge commissions on trades that are a minimum of $0.35 per share and up to 1% of the total value of the trade. (InteractiveBrokers, 2019). InteractiveBrokers also offers other options for investing such as stocks, bonds, and futures.

Another type of investment that individuals can participate in is a dividend reinvestment plan (DRIP), a dividend being a portion of profits paid to shareholders. “With DRIPs, the dividends that an investor receives from a company go directly towards the purchase of more stock, making the investment in the company grow little by little.” (Beers, 2018). DRIP shares come directly from the company that the individual is already invested in. Investors are given the option to receive their dividends in payment form or to have them reinvested to gain more shares of stock over time. This is a great option for some because they are obtaining more shares without having to pay more money out of pocket.

AbbVie (ABBV) is in the pharmaceutical market. They have direct purchase plans (DPP) and DRIPs. There is a minimum initial investment for non-shareholders to participate in the DPP. Investors must either pay $250.00 or enroll in ten monthly deductions of $25.00 each for their initial investment. Investors are also required to pay a one-time $10.00 enrollment fee to establish a new account. (AbbVie, 2019). The ABBV DRIP offers established shareholders the chance to buy more shares, commission-free, through automatic dividend reinvestment. (AbbVie, 2019). ABBV has 46 years of consecutive increases in dividend growth. The annual growth for 2018 was 40.2%. (Dividend, 2019). Their current dividend payout is $1.07 quarterly which is an increase of $0.11 over 2018 quarterly amount of $0.96.

JPMorgan Chase & Co (JPM) is in the financial services industry. JPM requires an initial investment of $250.00, which is the same as ABBV. However, under their DRIP the minimum purchase has to be at least $50.00 and can be no greater than $10,000.00. JPM also charges an account setup fee of $15.00, which is $5.00 more than the setup fee charged by ABBV. Their quarterly payout right now is $0.80, which is $0.27 lower than ABBV’s. There is no reinvestment fee that has to be paid by investors. (DRIP, 2019).

Cisco Systems, Inc. (NASDAQ:CSCO) announced a $25 billion repurchase plan in February of 2019. (Ryle, 2019). CSCO is in the computer hardware industry. In order to participate in their DRIP, investors must invest a minimum of $50.00, and their investments cannot exceed $250,000.00 per year. They must purchase at least 1 share in the company to qualify for participation in the DRIP. CSCO charges an investing fee of $5 + $0.05/share. They also charge a DRIP fee of 5% of the amount reinvested, up to $3.00. (directinvesting, 2019). CSCO may require a lower minimum investment than BAC, but BAC does not charge a DRIP fee and CSCO does. CSCO does not offer OCP to its investors either.

T-Notes and T-Bonds: US Treasuries offer many different notes and bonds. One note that is offered is the Treasury note (T-Note). T-Notes have different maturity terms depending on the face value of the note. For example, two- and three-year notes have $5,000 face values. (Chen, 2019). Yields on T-Notes are not static. Another US Treasury security is the Treasury bond (T-Bond). “(T-Bonds) have maturities of between 10 and 30 years. These investments have $1,000 face values and pay semiannual interest returns. The government uses these bonds to fund deficits in the federal budget.” (Chen, 2019). According to the text, the Pure Expectations Theory (PET) states that the shape of the yield curve depends on investors’ expectations about future interest rates. (Brigham, 208). If that holds true for T-Bonds, the yield curve would be curving downward, because rates for three different dates that continually decrease would not give investors much hope of an increase in rates in the near future.

Municipal Bonds: “Municipal bonds (or “munis” for short) are debt securities issued by states, cities, counties and other governmental entities to fund day-to-day obligations and to finance capital projects such as building schools, highways or sewer systems.” (USSEC, 2019). Municipal bonds are government securities; however, unlike T-Bonds, they do not have to be issued by the federal government. Like other bonds, munis are a debt for the government entity because that entity has to pay out interest payments to the investors that purchase the munis. The yield rates on munis are also currently in a downward trend. However, munis do not appear to be decreasing as fast as T-Bonds. If PET holds true for munis, the yield curve would be curving downward, because the continually decreasing rates for today versus last week, last month, and last year would not give investors much hope of an increase in rates in the near future.

Corporate Bonds: Corporate bonds are issued by a company or corporation as a debt security for the corporation. A corporate bond is riskier than a T-Bond or a muni because it is not backed by the government. The investor is taking a gamble on the credit worthiness and future profitability of the company or corporation issuing the bond. “Corporate bonds are a form of debt financing. They can be a major source of capital for many businesses, along with equity, bank loans and lines of credit.” (Chen, Feb. 2019). Although the risk is higher for corporate bonds in comparison to government backed bonds, the yield is also higher. For an investor willing to take the greater risk, there is a possibility of a greater reward as well. For me, as an investor, I would be more inclined to take the greater risk and invest in corporate bonds. I am only 21, if I take a risk now and it ends badly, I still have time to recover the loss. For older people who are retired, a T-Bond or T-Note would pose the least risk for them. Investing is a personal choice. An individual has to research the market and determine their own personal preference in regard to risk and return.

All of the projects are intertwined in the world of finance. From calculating ROA and ROE, to the interest rates and payments on personal debt, to the cost of investing, it is all about money. It is about how to determine what you have, what you need, and what you can afford. Finances look different from a corporate and a personal perspective. ROA and ROE are corporate calculations and evaluations. The interest paid on personal debt is revenue for the lender and an expense for the borrower. The purchase of stock by an individual increases the company’s purchasing power while it decreases the cash balance of the individual. The risk in purchasing stocks lies with the one who owns them. If the company does not remain profitable, the investor can lose every dollar they have spent on shares of the company. The company has already used the investor’s money to try to grow or improve its business. Sometimes it works for all involved and sometimes it does not work for either party. It is about investments and securities and risk management. Government securities are the investment of choice for many, if not most, risk-averse investors. People tend to trust that the government is a better investment than most private corporations.

Investing is a leap of faith in that an individual must believe that their investment will increase in value and make them wealthier in the process. If they did not believe in the dream of a better future, they would not risk spending their money on the investment to begin with. As we have learned there are many ways to get into the investment game. Now investments can be made from the privacy of your own home via an online broker. There are also some companies that offer DSSP as a means of purchasing shares directly from the company. There are also many companies who offer DRIPs as an option to cash payouts on dividends. Investors who do not need the money right now can opt to reinvest their dividends and purchase more shares. Anyone who is thinking of investing in stocks needs to do their research and find the company and the method that is a best fit for them.

AbbVie. (2019). Shareholder Information. AbbVie Investors.

Bank of America. (2019). Streamline Your Investing. Merrill, A Bank of America Company.

Beers, B. (2018, March 20). What is a DRIP? Investopedia.

Brigham, E. Fundamentals of Financial Management, Concise Edition. [South University].

Chen, J. (2019, February 23). Corporate Bond. Investopedia.

Chen, J. (2019, June 25). Government Securities. Investopedia.

directinvesting. (2019). Cisco Systems Inc. (CSCO). A reliable investment strategy: Identify a widely diversified portfolio of high-quality stocks and build up additional holdings at favorable prices.

Dividend. (2019). AbbVie Inc.

DRIP Advice. (2019). JP Morgan Chase (JPM) DRIP.

InteractiveBrokers. (2019). Other Fees.

South University. (2019). Week 3 Project. FIN2030 Foundation of Financial Management.

TD Ameritrade. (2019). Investing Built Around You.

US Securities and Exchange Commission. (2019). Municipal Bonds. What are Municipal Bonds?