Saturday, February 11, 2017

Two Bags

This semester I am taking a course called Architectures for Software Systems, one loaded with reading material, and I thought I would note down here an amusing joke I heard in today's lecture. A software architect, it goes, starts their career with two bags: 1) one bag full of luck, and 2) a second bag that is empty. The key point is that the second bag must be filled with experience before the first one runs empty. Only if you manage that, the saying goes, may you conclude that you are finding your way toward becoming an architect :) Said in jest, but I think these are words that carry the weight of truth.

Sunday, January 22, 2017

Quote of the day

Always design a thing by considering it in its next larger context—a chair in a room, a room in a house, a house in an environment, an environment in a city plan.

Gottlieb Eliel Saarinen (1873–1950)

Saturday, January 14, 2017

When Bad Requirements Happen to Good People

Super-famous and very good-looking authors Karl Wiegers and Joy Beatty did a fabulous job of describing this problem in a section titled "When Bad Requirements Happen to Good People" from their amazing book Software Requirements, Third Edition (Microsoft Press, 2013), reprinted here with permission:

When Bad Requirements Happen to Good People
The major consequence of requirements problems is rework—doing again something that you thought was already done—late in development or after release. Rework often consumes 30 to 50 percent of your total development cost, and requirements errors can account for 70 to 85 percent of the rework cost. Some rework does add value and improves the product, but excessive rework is wasteful and frustrating. Imagine how different your life would be if you could cut the rework effort in half! Your team members could build better products faster and perhaps even go home on time. Creating better requirements is an investment, not just a cost.
It can cost far more to correct a defect that’s found late in the project than to fix it shortly after its creation. Suppose it costs $1 (on a relative scale) to find and fix a requirement defect while you’re still working on the requirements. If you discover that error during design instead, you have to pay the $1 to fix the requirement error, plus another $2 or $3 to redo the design that was based on the incorrect requirement. Suppose, though, that no one finds the error until a user calls with a problem. Depending on the type of system, the cost to correct a requirement defect found in operation can be $100 or more on this relative scale. One of my consulting clients determined that they spent an average of $200 of labor effort to find and fix a defect in their information systems using the quality technique of software inspection, a type of peer review. In contrast, they spent an average of $4,200 to fix a single defect reported by the user, an amplification factor of 21. Preventing requirements errors and catching them early clearly has a huge leveraging effect on reducing rework.
Shortcomings in requirements practices pose many risks to project success, where success means delivering a product that satisfies the user’s functional and quality expectations at the agreed-upon cost and schedule.
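
To make the arithmetic in the excerpt concrete, here is a tiny Python illustration. The relative costs are the ones Wiegers and Beatty quote above; the snippet itself is mine and purely illustrative.

```python
# Relative cost of fixing one requirement defect, depending on when it
# is found (figures quoted in the excerpt above; real ratios vary
# widely by project and system type).
relative_cost = {
    "requirements phase": 1,           # find and fix while still writing requirements
    "design phase": 1 + 3,             # fix the requirement, plus redo the design
    "operation (user-reported)": 100,  # found by a user after release
}

# The consulting-client example: $200 per defect found by inspection
# versus $4,200 per user-reported defect yields the quoted factor.
amplification = 4200 / 200
print(amplification)  # 21.0
```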

Tuesday, January 3, 2017

Forgetting

We’ve talked about encoding, storage, and retrieval, the first three steps of declarative memory. The last step is forgetting. Forgetting plays a vital role in our ability to function for a deceptively simple reason. Forgetting allows us to prioritize. Anything irrelevant to our survival will take up wasteful cognitive space if we assign it the same priority as events critical to our survival. So we don’t. At least, most of us don’t.
Solomon Shereshevskii, a Russian journalist born in 1886, seemed to have a virtually unlimited memory. Scientists would give him a list of things to memorize, usually combinations of numbers and letters, and then test his recall. Shereshevskii needed only three or four seconds to “visualize” (his words) each item. Then he could repeat the lists back perfectly, forward or backward—even lists with more than 70 elements. In one experiment, developmental psychologist Alexander Luria exposed Shereshevskii to a complex formula of 30 letters and numbers. After a single recall test, which Shereshevskii accomplished flawlessly, the researcher put the list in a safe-deposit box and waited 15 years. Luria then took out the list, found Shereshevskii, and asked him to repeat the formula. Without hesitation, he reproduced the list on the spot, again without error.
Shereshevskii’s memory of everything he encountered was so clear, so detailed, so unending, he lost the ability to organize it into meaningful patterns. Like living in a permanent snowstorm, he saw much of his life as blinding flakes of unrelated sensory information. He couldn’t see the “big picture,” meaning he couldn’t focus on the ways two things might be related, look for commonalities, and discover larger patterns. Poems, carrying their typical heavy load of metaphor and simile, were incomprehensible to him. Shereshevskii couldn’t forget, and it affected the way he functioned.
We have many types of forgetting, categories cleverly enumerated by researcher Dan Schacter in his book The Seven Sins of Memory. Tip-of-the-tongue lapses, absentmindedness, blocking habits, misattribution, biases, suggestibility—the list doesn’t sound good. But they all have one thing in common. They allow us to drop pieces of information in favor of others. In so doing, forgetting helped us to conquer the Earth.

Source:

Tuesday, December 27, 2016

Evolution of an Algorithm

In linear algebra, the Coppersmith–Winograd algorithm, named after Don Coppersmith and Shmuel Winograd, was the asymptotically fastest known matrix multiplication algorithm until 2010. It can multiply two n × n matrices in O(n^2.375477) time. This is an improvement over the naïve O(n^3) time algorithm and the O(n^2.807) time Strassen algorithm. Algorithms with better asymptotic running time than the Strassen algorithm are rarely used in practice, because the large constant factors in their running times make them impractical. It is possible to improve the exponent further; however, the exponent must be at least 2 (because an n × n matrix has n^2 values, and all of them have to be read at least once to calculate the exact result).
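
Since the passage contrasts the naïve cubic method with Strassen's algorithm, here is a minimal Python sketch of Strassen's seven-product recursion (my own illustration, not code from the article). It assumes NumPy and square matrices whose size is a power of two; a practical version would pad inputs and fall back to ordinary multiplication below a cutoff size.

```python
import numpy as np

def strassen(A, B):
    """Multiply two n x n matrices (n a power of two) via Strassen's scheme."""
    n = A.shape[0]
    if n == 1:
        return A * B  # 1x1 base case
    k = n // 2
    A11, A12, A21, A22 = A[:k, :k], A[:k, k:], A[k:, :k], A[k:, k:]
    B11, B12, B21, B22 = B[:k, :k], B[:k, k:], B[k:, :k], B[k:, k:]

    # Seven recursive half-size products instead of the naive eight.
    M1 = strassen(A11 + A22, B11 + B22)
    M2 = strassen(A21 + A22, B11)
    M3 = strassen(A11, B12 - B22)
    M4 = strassen(A22, B21 - B11)
    M5 = strassen(A11 + A12, B22)
    M6 = strassen(A21 - A11, B11 + B12)
    M7 = strassen(A12 - A22, B21 + B22)

    # Reassemble the four quadrants of the product.
    return np.block([
        [M1 + M4 - M5 + M7, M3 + M5],
        [M2 + M4,           M1 - M2 + M3 + M6],
    ])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, B = rng.random((8, 8)), rng.random((8, 8))
    assert np.allclose(strassen(A, B), A @ B)
```

Cutting the eight half-size products of the naïve block method down to seven is exactly what lowers the exponent from 3 to log2(7) ≈ 2.807; Coppersmith–Winograd and its successors push the same idea much further, at the price of the large constant factors mentioned above.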
In 2010, Andrew Stothers gave an improvement to the algorithm, O(n^2.374). In 2011, Virginia Williams combined a mathematical short-cut from Stothers' paper with her own insights and automated optimization on computers, improving the bound to O(n^2.3728642). In 2014, François Le Gall simplified the methods of Williams and obtained an improved bound of O(n^2.3728639).
The Coppersmith–Winograd algorithm is frequently used as a building block in other algorithms to prove theoretical time bounds. However, unlike the Strassen algorithm, it is not used in practice because it only provides an advantage for matrices so large that they cannot be processed by modern hardware.

Source: Wikipedia, Coppersmith–Winograd algorithm