Over time, companies grow, diversify, acquire, change strategies, reorganize, rename, or restructure. As this happens, the nature and content of the data in their systems change, too. For example, the customer coding systems used by headquarters teams may differ radically from those used by a newly acquired subsidiary. Database schemas may vary between regions, departments, and business units.
Not addressing these issues (or being unaware of them) can have expensive consequences.
Estimating BI project activities is much more difficult than estimating traditional IT projects because no two BI projects are alike. For example, you may use a new tool, work with new team members, or have no experience with a new design method.
Three of the most commonly used estimating techniques (historical, intuitive, and formula-based) all assume prior project experience:
- The historical estimating technique expects you to have statistics on how long similar projects took in the past—but you may not have had a similar project before.
- The intuitive estimating technique expects you to predict, or guess, based on prior experience how long it will take to complete a similar activity—but you may have never performed a similar activity.
- The formula-based estimating technique expects you to know the longest, shortest, and most probable times it may take to complete an activity—but you cannot know those times for an activity you have never performed before.
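The formula-based technique described above is commonly implemented as the PERT three-point estimate, which weights the most probable time four times as heavily as the extremes. A minimal sketch (the function name and the sample task durations are illustrative, not from the source):

```python
def three_point_estimate(shortest, most_probable, longest):
    """PERT weighted average of three time estimates, in the same units as the inputs."""
    return (shortest + 4 * most_probable + longest) / 6

# An ETL development task estimated at 4 days (best case),
# 8 days (most probable), and 18 days (worst case):
print(three_point_estimate(4, 8, 18))  # 9.0 days
```

The weighting dampens the effect of an overly optimistic or pessimistic guess, but as the list above notes, the inputs themselves are only as good as your experience with similar work.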
In all those cases, it is best to consult with other people (in-house staff or outside, experienced experts) who have already developed an array of BI applications because your own uneducated guesses may be gross underestimates. This also demonstrates how important it is to track actual time on BI projects. You will need that information for estimating your next BI project.
Effort estimates cannot be completed until the activities and tasks are assigned, because the estimates must take into account each team member's skills and subject-matter expertise.
By most accounts, 80% of the development effort in a Big Data project goes into data integration; the rest goes toward data analysis. A traditional enterprise data warehouse (EDW) platform can cost over $60,000 per terabyte. At that rate, analyzing one petabyte (the amount of data Google processes in an hour) would cost $60 million.
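The arithmetic behind the $60 million figure can be sketched as follows, assuming the decimal convention of 1 petabyte = 1,000 terabytes:

```python
# Back-of-the-envelope EDW cost at the per-terabyte figure quoted above
cost_per_tb = 60_000          # dollars per terabyte (traditional EDW platform)
tb_per_pb = 1_000             # 1 PB = 1,000 TB (decimal convention)

total_cost = cost_per_tb * tb_per_pb
print(f"${total_cost:,}")     # $60,000,000
```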
Get your costs under control while you get the most value from the data you need. Ask a team that does this daily. We put our reputation on the line with every project, so we have to be really good at both budgeting and controlling ETL Plus project costs.
If you would like a friendly and neutral needs analysis of where you can maximize your ETL Plus and big data management ROI, just shout out and we can be your listening post. We know ETL budgeting and data delivery.