
Talk:Algorithmic efficiency


title


Better title needed. The word "efficient" is a common one used in many different contexts. --mav

What about 'efficient' and 'effective'? The German Wikipedia makes a distinction between the two. If there is such a distinction in English (I do not know, I am not a native speaker) then you should probably mention it.

efficiency/effectiveness


Moved this from the article:

The term "efficient" is very much confused and misused with the term "effective", though a scientific impact takes place. Efficiency is a measureable, quantitative concept, given by the equation: Efficiency = Output/Input (which is same as the concept productivity); or alternatively Efficiency = Output/Predetermined expectation. Whereas Effectiveness is a vague, almost non-quantitative concept, mainly concerned with achieving objectives. --212.72.25.10 08:01, 17 November 2005 (UTC)"[reply]

Perhaps created because of the redirect from effective? I'm changing the redirect to point to effectiveness, which already exists. --naught101 06:47, 18 November 2005 (UTC)[reply]

Improvement Drive


Time management is currently a candidate on WP:IDRIVE. Support it with your vote if you want to see this article improved to featured status.--Fenice 07:51, 5 January 2006 (UTC)[reply]

Optimality assumption


A common assumption is that there is always a tradeoff between time and space in an algorithm, but that is only true if the algorithm is actually on the optimal curve of time versus space. It is quite possible (and common) for a given algorithm to be well away from that curve, in which case both its time and its space consumption can be reduced, bringing it closer to the optimal curve. MikeDunlavey 19:20, 29 May 2007 (UTC)[reply]
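A minimal sketch of that point, using Fibonacci purely as an illustrative choice: the naive recursive version sits nowhere near the optimal curve, so rewriting it improves time and space at the same time rather than trading one for the other.

    # Illustrative sketch only: an algorithm that is off the optimal curve
    # can be improved in both time and space at once.

    def fib_naive(n):
        # exponential time, plus O(n) stack space from the recursion depth
        if n < 2:
            return n
        return fib_naive(n - 1) + fib_naive(n - 2)

    def fib_iterative(n):
        # linear time and constant extra space: better on both axes
        a, b = 0, 1
        for _ in range(n):
            a, b = b, a + b
        return a

    print(fib_naive(20), fib_iterative(20))  # both print 6765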

*Algorithmic* Efficiency


Looking through the history, this used to be a good page. But it's now bloated with dozens of semi-related topics (e.g., compiler and hardware optimizations). Those topics all have their own pages; they don't belong here. 128.84.98.73 (talk) 20:20, 14 May 2010 (UTC)[reply]

Agree completely; anything related to optimisation should be elsewhere. Murray Langton (talk) 05:32, 15 May 2010 (UTC)[reply]
When discussing the efficiency of algorithms in general, it would be extremely dumb to only consider what can be done to improve a pre-existing algorithm. Are you suggesting that programs should be produced first, without any regard for how well they are likely to perform, and then - once they are "working" - be revisited to see how they can be improved by optimization? If you are, then consider this:-
Imagine a 10 man-year project with 10 programmers working on it, blindly coding away for 9 months, only to discover later - during testing - that it all ran much too slowly and had to be completely re-written to improve its efficiency. The oft-repeated phrase "premature optimization" has a lot to answer for, and has been used in isolation and completely out of context to justify not caring how efficient an algorithm is from the start. As for "semi-related topics", when it comes down to efficiency, all things have to be considered simultaneously. A racing car designer has to consider tyres, suspension, engine capacity, fuel, airflow, safety, gears and brakes as a complete package (not forgetting the driver and his interfaces). Similarly, a software engineer should consider all "semi-related" but contingent topics during the design phase, or pay the price later. The good software engineer should, just like his mechanical counterpart, design and build optimally. There are rare exceptions, such as one-off programs - but even these have an inconvenient habit of being re-used again and again. —Preceding unsigned comment added by Kdakin (talk | contribs) 06:01, 8 June 2010 (UTC)[reply]
A lot of this article seems to be concerned with 'Optimization techniques' rather than the intrinsic efficiency of an algorithm. Most of that text should be moved either into Program optimization or into Optimizing compiler. Murray Langton (talk) 20:59, 13 May 2013 (UTC)[reply]

Algorithmic olympics


This article relates to the subject and is very interesting: wired magazine --Billymac00 (talk) 05:18, 8 January 2011 (UTC)[reply]

So why don't you (or somebody else) insert this as an external link with perhaps a note in the main text of the article about algorithm competitions (new section?). — Preceding unsigned comment added by 81.157.168.203 (talk) 16:37, 18 December 2011 (UTC)[reply]

Error (I think)

For example, a condition might test patients for (age > 18) before testing (blood type == 'AB-') because this type of blood occurs in only about 1 in 100 of the population. This would eliminate the second test at runtime in 99% of instances; something an optimizing compiler would almost certainly not be aware of - but which a programmer can research relatively easily even without specialist medical knowledge.

I think this is the wrong way around, isn't it? The blood-type check should be done first to avoid the age test (the second part of the statement seems to suggest it is meant this way around). But maybe I am just confused. --81.149.74.231 (talk) 16:23, 21 June 2012 (UTC)[reply]
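A minimal sketch of the ordering question (the patient fields are made-up illustrations, not the article's actual code): with short-circuit evaluation the second test only runs when the first one succeeds, so putting the rare blood-type test first skips the age test about 99% of the time, whereas the order quoted above hardly skips anything.

    # Hedged illustration of the condition ordering discussed above;
    # the field names are assumptions made for the sketch.

    def eligible_rare_test_first(patient):
        # AB- occurs in roughly 1% of patients, so the age comparison
        # is evaluated only for that 1%
        return patient["blood_type"] == "AB-" and patient["age"] > 18

    def eligible_common_test_first(patient):
        # most patients are over 18, so the blood-type comparison
        # still runs for the vast majority of them
        return patient["age"] > 18 and patient["blood_type"] == "AB-"

    print(eligible_rare_test_first({"blood_type": "O+", "age": 45}))   # False, age never checked
    print(eligible_common_test_first({"blood_type": "O+", "age": 45})) # False, but both tests ran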

FFT speedup


This section claims that advances in FFT algorithms 'may' increase processing speeds by a factor of 10,000 or so. Not only is this incorrect, but it is also mentioned nowhere in the articles it references (reference 13 as of this reading). While the linked piece seems to be a good article on algorithmic efficiency, the claim written on Wikipedia about it seems blatantly wrong, or is at the least too unsupported for me to verify from either the attached paper from MIT or the news article on MIT's website. — Preceding unsigned comment added by 75.149.152.50 (talk) 18:48, 18 January 2013 (UTC)[reply]

Kolmogorov complexity

Essentially this implies that there is no automated method that can produce an optimum result and is therefore characterized by a requirement for human ingenuity or Innovation.

This is nonsense.

A computer is perfectly capable of systematically searching for the shortest representation of some input in some specific language (see also superoptimization), and informing the user of the best result found so far (i.e. an upper bound on the Kolmogorov complexity). Since the search space is finite, given sufficient time and memory it will eventually find and output an optimal solution. However, once it has done so, it may nonetheless continue to run forever, evaluating remaining shorter candidate representations which, unknown to the algorithm, will in fact never halt. This is the reason the Kolmogorov complexity function is formally not computable: the search is not guaranteed to be able to determine that its best solution so far is optimal, which it would need to do in order to claim it has calculated the Kolmogorov complexity.

There is no requirement for human ingenuity or innovation here, just a requirement for human impatience to declare the latest solution to be good enough, and abort the (potentially futile) search for a better one. Note that in reality, there is probably also a limit on how long the evaluation of a representation is permitted to take anyway, and if such a limit is imposed then finding the optimum is computable.
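A toy sketch of that kind of bounded search (the expression language, alphabet and length cap are illustrative assumptions; every candidate here halts, so the search trivially terminates, unlike the general case described above):

    # Toy sketch: find the shortest arithmetic expression over a small
    # alphabet that evaluates to a target value. Each match found is an
    # upper bound on the target's complexity in this little language,
    # and trying lengths in increasing order makes the first match optimal.
    from itertools import product

    ALPHABET = "0123456789+*"
    TARGET = 2 ** 30          # 1073741824: ten characters as a plain literal
    MAX_LEN = 5               # cap on candidate length, keeping the search finite

    best = None
    for length in range(1, MAX_LEN + 1):
        for chars in product(ALPHABET, repeat=length):
            candidate = "".join(chars)
            try:
                value = eval(candidate)   # every candidate halts, unlike general programs
            except SyntaxError:
                continue
            if value == TARGET:
                best = candidate
                break
        if best is not None:
            break

    print(best)  # "2**30" - five characters instead of the ten-digit literal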

xmath (talk) 02:38, 5 May 2013 (UTC)[reply]

Proposed rewrite


Over the next month or so I intend to do a major rewrite of this article.

The biggest change will be to remove all the optimization techniques, which properly belong elsewhere; if they are not already in the other relevant articles then I will move them there.

Once I've got rid of these, I'll move on to the actual rewrite, hopefully adding references as I go.

If you have any comments on the above plan, please make them soon.

Murray Langton (talk) 16:31, 14 May 2013 (UTC)[reply]

Gal-Ezer and Zur


Why are they quoted in the opening paragraph? Who has heard of them? And what is the point of the quote itself? Telling the obvious, that efficiency is important? — Preceding unsigned comment added by Uziel302 (talk | contribs) 02:02, 20 August 2013 (UTC)[reply]

Moved to links. Uziel302 (talk) 06:40, 21 August 2013 (UTC)[reply]

Decimation of original article


Many of the highly relevant points in the original article have been completely removed. The latest offering, for example, assumes that it is improper to consider optimal methods before and during program creation, and instead relegates these to "optimization" (i.e. a later "fix") articles. The latest offering also seems to suggest that human inventiveness can be superseded by computation in finding a faster algorithm. Clearly he is an AI believer (and wrong)! — Preceding unsigned comment added by 86.138.82.181 (talk) 09:36, 1 June 2014 (UTC)[reply]

The title of this article is 'Algorithmic efficiency', which implies it is about the efficiency of algorithms. There are lots of separate articles related to 'program optimisation', so matters relating purely to optimisation have been removed from the article. According to Knuth: "Premature optimization is the root of all evil (or at least most of it) in programming." Murray Langton (talk) 08:25, 2 June 2014 (UTC)[reply]

Energy complexity


Should there not be a section on energy complexity? See for instance Roy et al. (2012).[1] A Google search will also reveal a lot of hits on the term "algorithmic complexity energy". I would add this material myself but it is a bit outside my area of expertise. Regards. RobbieIanMorrison (talk) 12:13, 13 July 2016 (UTC)[reply]

Energy complexity is strongly tied to hardware, which is exactly what complexity models try to abstract away from: a model that is accurate only for one specific piece of hardware has to be rebuilt for every new machine.
It is also strongly connected to time complexity: the time taken is tied to the number of operations performed, and so is the energy consumed. Because processors always draw a base level of power, an energy-efficient strategy is simply to maximize throughput and shut down as quickly as possible. Energy complexity therefore correlates with time complexity.
The paper you linked is interesting as a more formal way of approaching that idea, but I'm not sure it really adds anything. JSory (talk) 10:42, 12 April 2023 (UTC)[reply]
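A minimal sketch of that "race to idle" argument, with made-up constants (P_BASE and E_OP are illustrative assumptions, not measured values):

    # Hedged sketch: with a fixed baseline power draw, total energy tracks
    # running time, so doing less work saves energy as well as time.

    P_BASE = 1.0    # baseline power while the processor is awake (arbitrary units)
    E_OP = 0.01     # incremental energy per executed operation (arbitrary units)

    def energy(num_ops, ops_per_second):
        # energy = baseline power * running time + per-operation energy
        running_time = num_ops / ops_per_second
        return P_BASE * running_time + E_OP * num_ops

    # Halving the operation count (a better algorithm on the same hardware)
    # halves both terms, and therefore the total energy.
    print(energy(1000, 100.0))  # 20.0
    print(energy(500, 100.0))   # 10.0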

References

  1. ^ Swapnoneel Roy; Atri Rudra; Akshat Verma (2012). An Energy Complexity Model for Algorithms (PDF). Retrieved 2016-07-12.
External links modified

Hello fellow Wikipedians,

I have just modified one external link on Algorithmic efficiency. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 14:56, 1 July 2017 (UTC)[reply]

Even / Odd poor example


I removed the example of testing whether a number is even or odd. It isn't a good example, since the input size N is fixed at 1. A better example is one where N can vary, so I added an example of finding the median of a sorted list of numbers. SlowJog (talk) 22:38, 3 October 2017 (UTC)[reply]
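A hedged sketch of what such an example might look like (the article's actual code may differ): for an already-sorted list of N numbers the median is read off by index, and N is free to vary from call to call.

    # Hedged sketch of the median-of-a-sorted-list example mentioned above.

    def median_of_sorted(values):
        n = len(values)
        mid = n // 2
        if n % 2 == 1:
            return values[mid]                      # odd length: the single middle element
        return (values[mid - 1] + values[mid]) / 2  # even length: mean of the two middles

    print(median_of_sorted([1, 3, 5, 7, 9]))       # 5
    print(median_of_sorted([1, 3, 5, 7, 9, 11]))   # 6.0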

A Triplet of Horrid Articles


This article, time complexity and analysis of algorithms all overlap very strongly due to confusion over what they cover, despite in fact (as the titles imply) being distinct topics. It doesn't help that sections of the articles are written stream-of-consciousness style, and, even worse, contain vague, meaningless lectures with no relevance to the actual computational models. WP:NOTTEXTBOOK JSory (talk) 10:23, 12 April 2023 (UTC)[reply]

I agree. I am new to editing, and Wikipedia suggested this page to me for copyediting, but I know enough about this subject to see that the page is such a mess that copyediting would likely be a waste of time, though not enough to be confident in rewriting it. Maybe someday I will.
Computational complexity theory looks a bit better, and it seems like this page should link to that one for all that it explains, and then focus more narrowly on efficiency without trying to explain complexity - or at least limit the explanation to a very short section that refers to the larger article.
There is further confusion between that article and Computational Complexity, but it looks like Computational complexity theory is the better one. Crendrik (talk) 06:28, 21 February 2024 (UTC)[reply]