Saturday, 18 June 2011

COMPUTER SERVICES

How would someone find your website if they needed your products or services? They would most likely use a search engine such as Google or Yahoo; over 90% of websites are found in this manner, so the importance of placing high on the results page is obvious. The process of taking the steps required to rank high in the major search engines is "Search Engine Optimization" (SEO), and it is part of the internet marketing process.
Digital Science Web Technologies provides a highly effective search engine optimization and marketing service to improve traffic to your website. This includes detailed reporting of website traffic, redesign of your website's structure and content, submission to important search engines, submission to directories relevant to your business, and other strategies designed to improve your website's rank in search engines.
Advantages of Digital Science Web Technologies as a Search Engine Optimization Company
Digital Science Web Technologies' search engine optimization (SEO) service ensures your website places highly in major search engines such as Google, Yahoo and MSN. Prospective clients rarely search beyond the first page of results, so it is critical that your site is listed high for the keywords and phrases relevant to your business.
SEO consists of a number of areas such as:
  • Fresh and unique website content
  • SEO-friendly web page design
  • SEO-friendly meta tags and HTML
  • Link building from trusted sites
  • Submission to search engines and various directories
We Provide a Website Analysis Report
A Website Analysis Report is recommended as a way to determine your website's present position and effectiveness in the major search engines, along with suggestions for improving your ranking. This comprehensive report covers the following:
  • Page ranking on Google
  • Keyword effectiveness analysis
  • Analysis of the website's meta tags and content
  • Current link popularity
  • Web 2.0 and HTML code errors
  • Current page loading time
Submitting your website to search engines may increase your online sales dramatically. If you have invested time and money in your website, you simply MUST submit it online; otherwise it will be virtually invisible, and your efforts will have been spent in vain. If you want people to know about your website and to boost your revenues, the only way to do that is to make your site visible in the places where people search for information. We submit your website to multiple search engines.
Digital Science Web Technologies SEO Services:
  • Keyword Analysis
  • Keywords based Optimization Strategy
  • Make your Web pages SEO friendly
  • Meta Tag Creation and Optimization
  • 100+ Search Engine Submission
  • Expert efforts to achieve top-ten rankings in major search engines
  • Pursuit of indented listings
  • Competitors Research with Details
  • Keyword Ranking Reports
  • Make your Subpages SEO friendly
  • Link Development
  • Directory Submission
  • Robots.txt and Sitemap creation
  • Google Analytics setup, and many more…
Digital Science Web Technologies is a professional SEO company in India, specializing in search engine optimization, search engine submission and top-10 search engine placement, link exchange and link popularity building, pay-per-link services, SEO web design and website development, web promotion and internet marketing, and other SEO services.

EXAMINING TECHNOLOGY

While the use of electronic trading systems (ETS) in the fixed income (FI) market has become more common in recent years, only a small number of empirical studies have examined their adoption and impact on the market. In this study, we use the technology enactment framework (TEF) to identify various institutional and organizational factors that influence ETS adoption. Based on a comprehensive survey and face-to-face interviews with senior managers, sales representatives, and traders from major Canadian financial institutions, we find that the FI market is characterized by a high level of embedded relationships and that ETS users do not entirely trust their ETS. This study also presents, among other results, important findings regarding the impact of ETS on users' job performance and their perception of the asymmetric nature of the benefits provided by ETS in the FI market.
Keywords: Technology enactment, fixed income market, electronic trading systems, technology adoption

Personal Computer Repair and Laptop Repair

Efficient, Reliable, Trustworthy, and On Time!

Infotech Computers has been providing professional computer sales and computer repair services in Toronto for more than ten years. Locally owned and operated, we also offer Express Service, In-Home Repair Services, onsite computer services, and Data Recovery Services. Put your home or business computer in the hands of an A+ rated company with the Better Business Bureau for excellent and reliable Toronto Computer Repair Services.

Lets Get Physical: Inside The PhysX Physics Processor

No details about the internal structure of the PhysX physics processor have been released; however, there is a patent [patent] which describes the ideas behind the processor.  The patent itself is a very difficult read, as it describes, in detail, a number of complex subject areas in a legalistic style.  This is par for the course for patents, but just to make life more difficult it uses a seemingly endless stream of three-letter acronyms to describe the various parts of the design. Here’s an example:
“FPE 19 comprises, for example, four Vector Processing Engines (VPE), 19a, 19b, 19c, and 19d,
instead of the configuration shown in FIG. 11, including a SIU, and a plurality of SFU and VFU units. DME
18 further comprises a Switch Fabric 150, five Memory Control Units (MCU, 151a through 151d and 152),
PCI 34 and MIU 50.”
Many of these are not explained at first use, so you have to hunt for the explanation, and “ODE” is not explained at all. Perhaps an “Obfuscated Description Elucidator”?  (“Ordinary Differential Equation”, in case you’re wondering.)
However, hidden in all that is a description of not one but two potential designs for a physics processor and the software which runs on it (the software concepts are described in part 1).  I shall mostly concentrate on the second variant of the hardware, as it is described as the “presently preferred embodiment”, i.e. the version more likely to be built.  Unfortunately I’ll have to use a few three-letter acronyms myself, but not too many.
Inside The PPU
The PhysX PPU (Physics Processor Unit) chip is made up of 3 engines along with its own memory controller, PCI interface and various I/O ports.
The three engines are:
PCE - PPU Control Engine
DME - Data Movement Engine
FPE - Floating Point Engine
To put it simply:
The PCE controls everything.
The DME moves data in and out of memory.
The FPE does floating point calculations.
The PCE is a conventional RISC processor; which processor is used is completely unknown, but it handles tasks which require little computation or bandwidth, so it’s not going to be anything exciting.  There is no end of CPU cores available for this purpose (MIPS, ARM and PowerPC are three possible choices, but there are many more).  There’s really not much to be said about the PCE, as its job is simply to manage the DME and FPE by uploading their programs to them and communicating with the rest of the system.
Both the DME and FPE contain many blocks of RAM.  These are discrete blocks and are probably not mapped into the main system’s memory.  They will also not be caches: caches take up more room than plain blocks of RAM and would have little benefit in this processor; in fact, given the complexity involved, using caches here would most likely be a major disadvantage.

The Data Movement Engine (DME)

The PPU has a potentially vast amount of floating point power available, but this is of no use unless all the floating point units can be kept fed with data; the Data Movement Engine is responsible for doing this.
The DME is comprised of 5 memory control units, an external memory controller, a PCI bus interface and a “switch fabric”.
The switch fabric is a network of switches and busses which allows all the different units to talk to one another.  In this case the switch fabric has 7 x 256-bit bidirectional ports; the number of units which can be talking simultaneously is not specified.
The work done in the Data Movement Engine is controlled by a series of 5 memory control units.  Four of these are connected to the Vector Processing Engines in the Floating Point Engine; the fifth is connected to the PPU Control Engine.
Each memory control unit contains a block of RAM and it moves data to and from it.  This will mainly involve passing data to and from the external RAM and the vector processor element it is connected to, with the memory controller unit’s RAM acting as a buffer in between.  They are not limited to this however and can also move data to and from the other memory controller units and the PCI bus.
You may wonder why it doesn’t just move data directly to or from the vector processors but this is done to make the usage of the external memory bus as efficient as possible.  Moving data in big chunks is faster than moving data in small chunks so doing this will increase performance.  Keeping data in on-chip buffers also allows data to be moved around the chip without going to main memory, again saving memory bandwidth.
The connections to the Switch Fabric and the Vector Processor Elements are separate so it looks like two types of communication can be operating simultaneously.  For example, data could be written to one of the Vector Processing Elements while other data is being read in from external memory.
The Floating Point Engine (FPE)
The Floating Point Engine is the part of the PPU which does the real work, it performs all the actual physics calculations.
The FPE is made up of 4 Vector Processor Engines (VPE) and each of these is in turn made up of 4 Vector Processor Units, giving you in effect 16 vector processing cores.
The Vector Processing Units are not normal CPU cores but do contain some of the components normally found inside them along with some decidedly non-standard units.
All the data processing is done on 32 bit values stored in 16 floating point registers or 8 integer registers; there are likely other registers for program control and predication (a technique used in place of branches).
The execution unit appears to do vector processing with 6 elements, whereas 4 is the norm; the unit also contains a standard integer processing unit.  It is not described in any detail in the patent, but if it is anything like variant 1, the execution unit will use a hybrid processing model, issuing a single integer instruction and a 6-part vector instruction together as one VLIW (Very Long Instruction Word) instruction.
The Vector Processor Units also contain a set of internal memories, one of which is dedicated to storing the program being executed.  There is also an “Inter-Element Memory” which is used to store data for processing.  This is really a pair of memory blocks (A and B).  At one point bank A is accessed by the processor while bank B can be accessed by the Memory Control Unit.  When the processing and any data transfer are complete, access to these memories “switches”: the processor uses bank B while bank A is accessed by the memory control unit.  This technique allows both memories to be accessed at full speed simultaneously; it is in effect a hardware double buffer.
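The bank-switching behaviour is easy to picture in software terms. Below is a minimal Python sketch of the idea (the class and method names are mine, not from the patent): the processor always works on one bank while the memory control unit fills the other, and a swap exchanges the roles.

```python
class DoubleBuffer:
    """Hardware-style double buffer: the processor works on one bank
    while the memory control unit fills the other, then the roles swap."""

    def __init__(self):
        self.banks = [[], []]    # bank A and bank B
        self.processor_bank = 0  # index of the bank the processor owns

    def processor_view(self):
        # Bank currently owned by the processor (being computed on)
        return self.banks[self.processor_bank]

    def controller_view(self):
        # Bank currently owned by the memory control unit (being filled)
        return self.banks[1 - self.processor_bank]

    def swap(self):
        # Called once both the computation and the transfer are finished
        self.processor_bank = 1 - self.processor_bank


buf = DoubleBuffer()
buf.controller_view().extend([1.0, 2.0, 3.0])  # MCU streams data into bank B
buf.swap()                                     # processor now owns bank B
print(buf.processor_view())  # → [1.0, 2.0, 3.0]
```

Because neither side ever touches the other's bank between swaps, both can run at full speed with no contention, which is exactly the point of the hardware arrangement.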

Inside The PhysX Physics Processor

The Alternative And Others

The Other PPU
Part 2 contains a description of the second PPU design given in the patent; the first design used a completely different DME and FPE.
As we’ve seen, the second version divides the DME into a number of memory control units and the FPE into a number of vector processing units.  The first version made no such division in the DME, and division in the FPE was limited.
The version 1 DME comprised a set of address generators, busses, memories and crossbar switches.  It looks nothing even remotely like the memory control units of variant 2.  With 6 address generators running simultaneously to control data movement, it looks like it would be truly mind-bogglingly complex to program, and probably less flexible.
The variant 1 FPE was loosely divided into a number of vector processors but used a complex series of register banks, some shared between the processors and others not.  The actual processing used the same hybrid vector-VLIW method described for variant 2, but with a more traditional 4-element vector, and it included the ability to do both integer and floating point scalar operations alongside the vector computations.
The “preferred” second variant is conceptually a lot simpler than the first version described in the patent and most likely easier to program.  Crucially, however, the second variant will also be easier to manufacture.  There are always errors on chips causing parts to fail; if the design is divided up, faulty parts can be deactivated and the chips sold with less functionality at a lower price, a practice common in the semiconductor industry.
The first version was divided into sub-parts, but not to the same degree as the second.  A single fault in the wrong area could conceivably make the chip useless, lowering the number which could be sold.
It’s quite conceivable the final product will differ again from both versions described in the patent, but I doubt it will change at the highest design level.  Lower-level changes are very likely, however, as design simulations reveal the system’s behaviour and changes are made accordingly.  Changes may also be made after prototypes are built, but these are likely to be relatively minor.
Differences Between The PPU And Desktop Processors.
The PPU is really quite different from pretty much any conventional processor and is more akin to a GPU (Graphics Processing Unit).  This isn’t surprising, given it’s dedicated to physics processing, not general-purpose processing.
The programming model for the PPU is also likely to be wildly different from a conventional processor’s.  In a conventional processor you do everything in a single program with a single memory: you move data into on-chip registers, do operations on the data, then write it back out to memory.  You need to understand program flow when writing the program, but you do not need to explicitly control loading and executing the instructions, as this is handled by the processor.
The PPU breaks these operations into parts so there is one program for loading data and another for performing operations on it.  A third program running on the PPU Control Engine is responsible for uploading these programs into their respective memories and instructing the processors which address to start executing from.
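As a purely illustrative analogy (every name here is hypothetical, and the real programs would be uploaded microcode rather than Python), the split described above might look like this: one routine moves data, one computes on it, and a control routine sequences the other two.

```python
# Illustrative sketch of the PPU's split programming model: one program
# moves data, another computes, and a control program sequences both.
# All names are hypothetical, not from the patent.

def data_movement_program(external_memory, buffer):
    """DME-style program: stream a chunk of data into an on-chip buffer."""
    buffer[:] = external_memory[:len(buffer)]

def compute_program(buffer):
    """FPE-style program: operate on the buffered data in place."""
    for i, v in enumerate(buffer):
        buffer[i] = v * 2.0

def control_program(external_memory):
    """PCE-style program: upload and sequence the other two."""
    buffer = [0.0] * 4
    data_movement_program(external_memory, buffer)
    compute_program(buffer)
    return buffer

result = control_program([1.0, 2.0, 3.0, 4.0, 5.0])
print(result)  # → [2.0, 4.0, 6.0, 8.0]
```

The key point is that data movement and computation are separate programs which can, on the real hardware, run concurrently rather than one after the other as here.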
While the PPU may sound highly complex to program, this is a problem that will most likely only be faced by Ageia’s driver writers.  Normal developers will program the chip via the PhysX / Novodex API, with custom routines most likely written in a customised language, in a similar manner to the way custom shaders are used on GPUs.  The complexity will, for the most part, be completely hidden, as is the case with GPUs today.
..And The Cell
The PPU is quite different from conventional microprocessors but is similar in some respects to the PS3’s Cell processor; this is not surprising, as they are both multicore vector processors.  They are different, however, in that the Cell was designed as a more general-purpose(ish) processor, whereas the PPU has been designed to accelerate a specific type of computation.
There are a number of similarities:
  1.  No cache - the memory arrangement is similar to the “Local stores” in the Cell’s SPEs.
  2.  Processing routines cannot directly access memory.
  3.  No out-of-order processing hardware.
  4.  Designed to operate at as close to its theoretical maximum performance as possible.
  5.  Very high memory bandwidth, with hardware designed to make maximum use of it.
There are on the other hand also some considerable differences between the Cell and the PPU:
  1.  The Cell contains a fully fledged Power processor which will run an OS and applications.  The PCE is likely to run a very rudimentary OS, and as such will likely be a low-end device such as an ARM core.
  2.  Being designed for physics only, the instruction set looks likely to be limited.
  3.  Processing operations are all 32 bit floating point, with 32 bit integer processing seemingly present only for control purposes.
  4.  No virtualisation - all the cores on the Cell contain memory management units; at most the PPU might have one in the PCE, but even that’s not a given.
The Competition
At the moment no other company has announced plans for a dedicated physics accelerator but that’s not to say there’s no competition.
ATI in particular have been talking about using their GPUs for physics, but quite how they’ll manage that if the card is already busy with graphics is unknown.  GPUs aren’t designed for physics in particular, but they are becoming increasingly programmable these days [D3D10] and they certainly have the floating point capabilities.  Perhaps we’ll see modified or even re-badged graphics cards sold as physics accelerators.
If other cards do appear on the market they will have the problem of which physics API to support; there is no single standard for physics programming, and Ageia are unlikely to support direct competitors.  Each card may end up having its own API.  There is hope, however: a widely supported standards group recently released COLLADA v1.4, which includes a physics API [COLLADA].  COLLADA could end up as the standard API for physics, unless, that is, Microsoft decides to invent one.  Interestingly, Ageia’s API is mentioned on the COLLADA site, so there appears to be at least some interoperability with the standard.
Consoles are not in direct competition with Ageia but are competing with gaming PCs.
The PS3’s Cell processor and the XBox360’s triple-core PowerPC were both designed for very high floating point computational capability; in both cases this will give the consoles enhanced physics capabilities.  As mentioned above, the Cell in particular has a similar architecture to the PhysX chip, so it should do well on game physics.  Indeed, Sony seem to be pushing the idea of the PS3 using the Cell to do “natural motion” through physics [Natural].
Ageia have been pretty smart in that they are supporting both these consoles through their APIs.  Any game being ported from consoles to PCs can thus relatively easily take advantage of the PhysX chip.  The same is also true in the other direction.
Non-Gaming Uses
While the PPU is designed for physics calculations, it’s quite possible that other uses will be found for it.  GPUs are designed for 3D graphics but have been found to be useful for many other types of processing.  This is an increasingly popular usage, as they can perform many times faster than desktop processors in many instances.
One potential use would be in “physical modelling” synthesisers; these use physics processing to simulate the individual parts of musical instruments and can create strikingly realistic sounds.  Physics is also used in engineering, and of course scientists could also find the processor useful.
Conclusion
Consoles will have the raw floating point power necessary for physical simulation; the PhysX will bring that capability to gaming PCs.  It is, however, another card to add, so the price of a gaming rig will move yet further away from games consoles.
Gaming is becoming ever more realistic looking, and in the future these looks will be accompanied by increasingly realistic behaviour as well.  While you can expect many to be initially sceptical of the benefit of physics, I expect this will change as more and more games make use of the technology and the difference becomes obvious.
Whether it becomes a standard piece of equipment for all PCs is another matter.  With interfaces beginning to use 3D, everyone can now use the 3D hardware; it’s not clear if there is a comparable common use which would allow everyone to make use of a physics accelerator.
With their massive computing capabilities, consoles were looking like they’d get quite far ahead of gaming PCs.  Quad SLI set-ups may produce amazing graphics, but without the raw physics processing power the games would, paradoxically, look unrealistic.  The PhysX will bring these capabilities to PCs and as such will very likely become a standard part of gaming PCs.

The PhysX Physics Processor

3D acceleration is a long-established standard part of today’s systems, yet it started life as an exotic, expensive add-on. Last year Ageia announced a new kind of add-on: their PhysX chip is a new technology specifically designed for accelerating physics processing in games. Why should a physics chip interest gamers? What does it do? How does it work?

As games become ever more realistic it’s becoming apparent that some things are not as realistic as they could be. Objects just don’t behave the way they would in real life.

If you fire a rocket at a wall it might leave a scar but the actual wall is likely to remain unchanged otherwise. Even if you manage to blow it up it’s not going to look like a wall that’s been blown up in real life.
If you blow up a building it too may blow up in a decidedly non-realistic manner.
Physics acceleration will change all this.


A Load Of Balls

Why is this?
To explain I’ll give an example:

Let’s say you’re on a football (soccer) field and you have three balls:
A beach ball.
A football.
A solid iron ball.

Let’s say you give each a healthy kick.
If you kick the beach ball it’ll fly off but then slow down rapidly, it doesn’t even get near the goal posts and the goalie laughs at you.

If you kick the football it’ll fly off but continue for a greater distance, the goalie dives to save it.

In reality you’d probably break your foot kicking a solid iron ball, but let’s say, for argument’s sake, you happen to be Superman. You kick the iron ball and it flies off and keeps going for a long, long way. The goalie dives out of the way - to save himself.

In a game, unless it has been specifically programmed otherwise, something else will happen:
You kick the beach ball, it flies off.
You kick the football, it flies off.
You kick the iron ball, it flies off.

In real life the three balls are made of different materials which have different properties, and this means they act in very different ways. In the game the balls have no properties, so they just act according to the program controlling them; if the program does not explicitly make the different balls behave differently, they all act the same.

If you want to create a game which not only looks realistic but also acts realistically, you need to give objects and the environment itself real properties. You have to figure out what happens when the objects interact. Real physical properties and real physical actions - this is what game physics is all about.


Lets Get Physical

Physics in games is all about movement and things bashing into one another. You don’t need to worry about objects sitting on their own doing nothing since they’re not interacting with anything else.

To get the balls to move correctly you need to model not only the mass of the ball but also gravity, spin and wind resistance. When you kick the ball you transfer energy from your foot to the ball, and the ball flies off. The ball will at first resist movement; this is known as inertia and depends on the mass of the ball. By overcoming the inertia you set the ball in motion, and in doing so you give the ball momentum.

When the ball is flying it is subject to wind resistance and gravity; these both fight the momentum and will bring the ball to a stop. Spin will have an impact on the direction of the ball and on its direction after it hits something.

The beach ball will have very little momentum since it has low mass, and it will be subject to higher wind resistance since beach balls are generally large. The football has a higher mass and is smaller than the beach ball; this means it will take more energy to stop, it will be subject to less wind resistance, and it will thus go further than the beach ball.

A solid iron ball will take a lot of energy to get moving in the first place, since it’s heavy, but once moving it will also take a lot of energy to stop. This means the iron ball will fly a long way and cause a lot of damage when it hits something. This is why you use heavy things for bombarding the enemy if you are fighting a war; bombarding someone with beach balls would prove somewhat ineffective, unless you just wanted to annoy them to death. It’s also the reason bullets are made of lead: a heavy metal does more damage.
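The effect is easy to demonstrate with a toy simulation. The sketch below uses simple Euler integration with quadratic air drag; the masses, drag areas and kick speed are illustrative guesses, not measured values, but the ordering of the results is the point.

```python
def distance_travelled(mass, drag_area, kick_speed=30.0, dt=0.01):
    """Roughly integrate a ball kicked horizontally: quadratic air drag
    slows it, and we stop counting once its speed is negligible.
    (Gravity and bouncing are omitted to keep the sketch short.)"""
    rho = 1.2          # air density, kg/m^3
    v = kick_speed     # horizontal speed, m/s
    x = 0.0            # distance covered, m
    while v > 0.5:
        drag = 0.5 * rho * drag_area * v * v   # quadratic drag force, N
        v -= (drag / mass) * dt                # deceleration from drag
        x += v * dt
        if x > 10000:                          # safety cut-off
            break
    return x

beach_ball = distance_travelled(mass=0.1,  drag_area=0.3)
football   = distance_travelled(mass=0.45, drag_area=0.15)
iron_ball  = distance_travelled(mass=30.0, drag_area=0.15)
# The same kick, three very different distances, purely because
# mass and drag area differ - no per-ball special cases needed.
```

Run it and the three distances come out in exactly the order the text describes: the beach ball stops almost immediately, the football carries further, and the iron ball keeps going for a very long way.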

There’s more to it than just slowing things down, though. Flying objects slow down in a certain way, and this changes their path as they fly through the air. If you model air resistance or gravity incorrectly, the object’s path through the air will be wrong and you’ll see this. You probably won’t be consciously looking at details like this, but in the background your brain will, and when things are wrong you’ll notice. To create a game which acts as it would in real life, you need to create a convincing physical simulation.

There is a lot more to it than this of course, but these are just a few of the things which need to be considered if you wish to create such a simulation.


How Does This Affect Games?

Making objects behave “properly” is all very good but you’d be correct to ask why gamers should care.
The answer can be summed up in one word: BOOM!

When things blow up, the explosion and the bits thrown out by it are all subject to the same laws I’ve talked about above. The bits will also bash into one another and interact with the surfaces they hit, and all these interactions are also governed by physical laws.

Physics will directly impact the look of a game: explosions will now look real, and pre-scripted explosions will stand out as tame and unrealistic by comparison.

Once you’ve seen physics done properly in a game, other games will seem restrictive and inaccurate.
The difference will not be as obvious as the difference between, say, different types of graphics rendering; screen shots can show that. No, you will need to see the actual game running to see the difference extensive physics support will make.


Isn’t All This Realism Going To Make Games All The Same?

Movies are usually made with real people doing real things in the real world; this is true even if they are pretending to be in an unreal world. While there is a much greater emphasis on CGI (Computer Generated Imagery) these days, a lot of the action shots are still done with real items, or at least real models (the Lord of the Rings trilogy, The Matrix trilogy and War of the Worlds all did this to one degree or another). Even in fantasy or sci-fi movies you are seeing very real physics in action. This physical realism does not hurt the movies; if anything, it does the opposite. I don’t expect it to be any different in games.

Physical simulation does however have the advantage that it can be programmed so that it simulates something other than this planet. If you create a game which is set on the moon you can create a simulation of the moon complete with the lower gravity. Moon Olympics anyone?

But that’s not all you can do; there’s nothing to stop you using the simulation to create physics which are decidedly unreal. You could, for instance, give your baddies extreme properties: opponents made out of gas, water, metal, liquid metal (think Terminator 2), fire, and who knows what else. This could make for some interesting, challenging games. How exactly do you kill a ghost if it looks and acts like a gas? Bullets won’t exactly do much damage. So, now you can write Moon Olympics with ghosts.

There’s nothing to stop you going completely mad and having composite properties which would not, or could not, exist in the real world. A baddie made out of neutron star material would pose something of a difficulty to defeat, but then this could be part of a game set in the metallic hydrogen core of Jupiter, where you may just be able to swim around and find an appropriate weapon. Hmm, sounds like there’s a Dr Who game in there somewhere...

So, everything acting “real” doesn’t mean games will all be the same; quite the contrary, in fact. Physical simulation opens doors to all sorts of new game possibilities.


How It Works: The Physics Simulation Software

The PhysX [PhysX] chip itself is pretty useless without software to run on it; providing that is the job of the physics simulation software.

The simulation is broadly divided into four parts:

1. Host Interface
2. Collision Detection
3. Force Computation
4. Dynamics Simulation


Host Interface
This part communicates with the game engine. If you happen to drive a tank through a wall the physics simulation can figure out what happens to the wall. But, in order for this to happen the simulation has to be told where you are moving the tank.

The interface goes well beyond just moving objects around, though: the game engine can change the properties of the simulation in real time. If the action is on a space station and gravity gets switched off, the engine will tell the simulation and the simulation will change accordingly.

The host interface also works the other way around, as often the simulation will trigger an event and need to tell the game engine about it. An example would be when an object hits your player: the simulation tells the game engine, and it deducts a few life points.

Collision Detection
When your tank drives into the wall, it collides with the bricks it’s made from. In this case the force of the tank colliding with the wall will cause the bricks to move; collision detection works out which bricks are affected.

This is a very complex process, as all manner of calculations are involved and you have to compare every object against every other object. In reality this requires far too much computation, so various techniques are used to reduce the overhead. One method breaks the scene up into different zones and only compares objects within the same zone. Another method is to use less accurate “coarse” collision detection to see if there is a collision, then use more compute-intensive “fine” detection to work out exactly what is hitting what, and where.
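The coarse/fine idea can be sketched in a few lines of Python (a toy 2D version; real engines use 3D volumes and much smarter broad phases): cheap axis-aligned bounding-box tests reject most pairs, and only the survivors get the exact, more expensive test.

```python
import itertools

def aabb_overlap(a, b):
    """Coarse test: axis-aligned bounding boxes, very cheap.
    Each box is (min_x, min_y, max_x, max_y)."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def circles_collide(c1, c2):
    """Fine test: exact circle-vs-circle check, more expensive.
    Each circle is (x, y, radius)."""
    dx, dy = c1[0] - c2[0], c1[1] - c2[1]
    r = c1[2] + c2[2]
    return dx * dx + dy * dy <= r * r

def bounding_box(circle):
    x, y, r = circle
    return (x - r, y - r, x + r, y + r)

def find_collisions(circles):
    hits = []
    for (i, a), (j, b) in itertools.combinations(enumerate(circles), 2):
        if aabb_overlap(bounding_box(a), bounding_box(b)):  # coarse phase
            if circles_collide(a, b):                       # fine phase
                hits.append((i, j))
    return hits

objects = [(0, 0, 1), (1.5, 0, 1), (10, 10, 1)]
print(find_collisions(objects))  # → [(0, 1)]
```

The far-away third object is rejected by the box test alone, so the exact test only runs on the one pair that might actually be touching.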

Force Computation
When the tank hits the wall, some bricks will be pushed out of the way. Collision detection tells you which ones are affected, but now you need to work out the exact details of each collision. You need to calculate the forces acting on each brick; these will differ, as they depend on how and where each brick is being hit. These forces include the force of the tank hitting it, gravity, and the force of the bricks above pushing downwards. Other bricks may not be hit directly but will nevertheless be affected: bricks above the ones being hit will now have nothing to hold them up, so they will fall.

Dynamics Simulation
When something moving collides with something else, its speed and path will change from what they were before. How fast objects move and what direction they go in is calculated by working out how the forces acting on them interact. Even if objects do not collide, there will still be changes due to wind resistance and gravity. Dynamics simulation handles all this.
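A single step of such a simulation can be sketched in a few lines. This is a generic semi-implicit Euler update, not Ageia's actual algorithm: sum the forces on an object, update its velocity, then move it using the new velocity.

```python
def dynamics_step(position, velocity, forces, mass, dt):
    """One semi-implicit Euler step: sum the forces, update the
    velocity from the resulting acceleration, then update the
    position using the new velocity."""
    fx = sum(f[0] for f in forces)
    fy = sum(f[1] for f in forces)
    vx = velocity[0] + (fx / mass) * dt
    vy = velocity[1] + (fy / mass) * dt
    x = position[0] + vx * dt
    y = position[1] + vy * dt
    return (x, y), (vx, vy)

# A 2 kg brick knocked sideways while gravity pulls it down:
pos, vel = (0.0, 5.0), (1.0, 0.0)
forces = [(4.0, 0.0), (0.0, -2.0 * 9.81)]   # push from the tank, gravity
pos, vel = dynamics_step(pos, vel, forces, mass=2.0, dt=0.1)
```

Running this step repeatedly, with the force list refreshed by the force computation stage each time, is what traces out the curved, slowing paths the article describes.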

Working Together
All these parts need to work together to give a correct simulation.
If a brick happens to hit your player while you’re driving through the wall, you need to calculate what the brick and your player do as a result. Since your player was involved, the game engine will need to be notified so some health points can be deducted. In such a case all the parts of the simulation are involved.

Complexity
A real simulation is much more complex than the system I've described, as many different factors need to be considered for it to be realistic. All of these require a large number of mathematical operations, and to keep movement smooth these calculations are often run at twice the frame rate of the game.
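Running physics at twice the frame rate typically means two fixed physics sub-steps per rendered frame. A sketch of that loop, with an assumed 60 fps render rate and a deliberately trivial "physics" function:

```python
# Two fixed physics sub-steps per rendered frame. The 60 fps render rate
# is an assumption for illustration.
FRAME_DT = 1.0 / 60.0          # time covered by one rendered frame
PHYSICS_DT = FRAME_DT / 2.0    # physics runs at 120 Hz

def advance_frame(state, physics_step):
    for _ in range(2):                     # two physics updates per frame
        state = physics_step(state, PHYSICS_DT)
    return state

# Toy "physics": height drops at a constant 9.81 m/s (not real free fall,
# just enough to show the sub-stepping).
fall = lambda h, dt: h - 9.81 * dt

h = 100.0
for _ in range(60):                        # one simulated second of frames
    h = advance_frame(h, fall)
# After 60 frames (120 physics steps) exactly one second has elapsed.
```

The advantage of a fixed, smaller timestep is that fast-moving objects are less likely to pass through thin obstacles between updates, and the integration error per step shrinks.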

Simplicity
In some ways adding a complex physics simulation simplifies things in other areas.

For example, the simulation can include gravity and wind resistance, but these can be ignored for a game set in space. Since gravity is a force that acts on everything, every object will be affected and act accordingly. If you blow something up on the ground the parts will go flying everywhere, but they will slow down due to wind resistance and fall back to earth due to gravity. If you blow something up in space, where neither of these forces is present, the parts will fly off in every direction and never stop.

Supporting both these environments could otherwise involve a lot of different code for each; the physics simulation, on the other hand, can handle both simply by including (or leaving out) a pair of "force objects".
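The "force object" idea can be sketched as follows — the same simulation loop runs for both levels, and only the list of force objects differs. The representation of forces as callables and all constants are assumptions for illustration.

```python
# One simulation, two environments: optional "force objects" are included
# for a ground level and omitted for a space level. Forces act on a 2-D
# velocity vector; constants are illustrative.
def simulate(vel, forces, dt, steps):
    for _ in range(steps):
        for f in forces:
            vel = f(vel, dt)
    return vel

gravity = lambda v, dt: (v[0], v[1] - 9.81 * dt)               # pulls everything down
drag    = lambda v, dt: (v[0] * (1 - 0.5 * dt),                # wind resistance slows
                         v[1] * (1 - 0.5 * dt))                # everything gradually

debris = (10.0, 10.0)                       # debris flung outward by an explosion
on_ground = simulate(debris, [gravity, drag], dt=0.1, steps=50)
in_space  = simulate(debris, [],              dt=0.1, steps=50)  # no forces at all
```

On the ground the debris slows and ends up falling; in space its velocity never changes — the same outcome the article describes, with no environment-specific code beyond the force list.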

The simulation can also be changed on the fly to allow all sorts of interesting effects. A racing game may model a car accurately while racing, but when a crash occurs the parameters can be altered to make the crash look more spectacular [Car].

Visible objects have physical properties, but it's also quite possible to give invisible objects properties; this allows things like invisible force fields to be added to a game.
The simulation can also be used for tasks that aren't physical at all: you could place an invisible object with no physical properties in a level and use it as a trigger. When collision detection indicates the object has been touched, the game engine can trigger an action such as making new baddies appear. You could use this sort of arrangement at a corner to simulate a guard watching out for you.
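A trigger of this kind can be sketched as a hypothetical object that collision detection queries but that never pushes back on anything. The class name, callback style, and circular trigger region are all assumptions for illustration.

```python
# Hypothetical trigger object: invisible, no physical response, but collision
# detection reports contact so the game engine can react (e.g. spawn enemies).
class Trigger:
    def __init__(self, x, y, r, on_enter):
        self.x, self.y, self.r = x, y, r
        self.on_enter = on_enter       # game-engine callback to run on contact
        self.fired = False

    def check(self, px, py):
        """Called by collision detection with the player's position."""
        if not self.fired and (px - self.x) ** 2 + (py - self.y) ** 2 < self.r ** 2:
            self.fired = True          # fire once, like a tripwire
            self.on_enter()

events = []
guard_corner = Trigger(5.0, 5.0, r=2.0,
                       on_enter=lambda: events.append("spawn guards"))

guard_corner.check(0.0, 0.0)   # far away: nothing happens
guard_corner.check(5.5, 4.5)   # player rounds the corner: trigger fires
guard_corner.check(5.5, 4.5)   # already fired: no duplicate event
```

Because the trigger has no mass or collision response, it costs almost nothing in the simulation while still reusing the engine's existing collision detection.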


Physical Hardware - Why?

Physics simulation already exists in games, but you need a lot of horsepower to do it properly. It requires massive floating-point capability, and collision detection in particular can demand massive memory bandwidth. General-purpose CPUs have neither and are consequently not very good at these simulations. It is for physics and similar tasks that the processors in next-generation games consoles such as the Xbox 360 and PS3 have high bandwidth and huge floating-point capability.

For the PC, Ageia has created a custom processor dedicated to handling physics for much the same reason. According to Ageia, current dual-core processors can handle around 1,000 rigid bodies, whereas the PhysX Physics Processing Unit (PPU) can handle up to 32,000.

For a convincing physical simulation a game could have thousands of objects on screen at any one time. Right now there is simply no way of doing this without dedicated hardware.

Templates Designing, Web Animations, Fonts & more

Instant access to:

* 1000 Amazing HTML/Photoshop Website Templates
* 5000 Web Animations / Clipart Images
* 1000 Exclusive Fonts
* Dreamweaver / FrontPage / Text Editor Ready
* Professionally designed with diverse themes
* One-time fee provides unlimited access for 3 Years!


Benefits:

* Professionally designed web templates
* Download the source files for all the featured templates
* Access to numerous exquisitely designed website template layouts
* Access to PSD files in addition to the HTML files for the web templates
* Web templates are updated with new additions on a regular basis
* Instantly downloadable upon account verification
* Easily uploadable to your web servers
* Very easy to incorporate web content such as text, images, etc.


Digital Science Web Technologies offers you a comprehensive website template gallery consisting of HTML/Photoshop web templates. Our website templates are easy to edit. All the featured templates have been designed to meet your business needs in terms of both quality and cost, and are available in various categories, which makes our gallery one of the most comprehensive and professionally designed collections available on the web today. You will also have access to the PSD files used to design the templates displayed in the gallery. You can edit the HTML files using any HTML editor, and to edit the image files you will need an image editor that supports PSD files, such as Photoshop, ImageReady or Paint Shop Pro.

What is a Website Design Template?

Website templates are pre-made web designs which can be easily customized to reflect your company's branding; we promise to make them fit your needs 100%. Templates are available in Photoshop and HTML formats.
A design template can be thought of as a pre-made website: a ready-made layout for your web pages. Each template is made to be easily customized by anyone with some web design knowledge, and includes fully coded HTML index and content pages, as well as a set of blank images and Photoshop files to make your customization job as easy as possible.

How To Keep Your Extreme Commuters Happy

A few weeks ago, I asked several BNET readers to track their time. The logs are starting to come in, and while everyone has his own challenges, one common complaint is a too-long commute.
BNET reader Amanda Bledsoe wrote: “I always thought of myself as pretty organized, but since taking a new job last summer that has me commuting 2 hours a day, I’m really struggling.” That hour in the morning was time she’d prefer to spend exercising, and the hour in the evening sapped her energy for anything active or creative at night. She’d come up with a few good solutions — brushing up on her Spanish vocabulary with audiobooks, carpooling, and working from home once a week — but 8 hours in the car per week is still a lot. How could she use it?
Likewise, BNET reader Aprill Nelson’s time logs showed commutes that ranged from 30 minutes to over an hour. She wanted to “spend more time learning about different topics in my industry, which would help me become more productive.” We agreed that her commute would be a great time to do that, but with both of us hunting around, we were coming up short on audio books or courses dealing with petroleum engineering (she’s in the oil & gas business).
So what to do? Long-term, we need a work culture that allows for more remote work. We probably need more mass transit, too, not only to lessen traffic, but to give commuters a chance to read or relax rather than battling other drivers.
But in the meantime, this suggests a great opportunity for managers or HR departments to help extreme commuters fill this dead time. Why not create audio training materials dealing with your specific company or industry to help commuters use that time to advance in their careers? There are plenty of broad offerings out there (audio books on business topics, or survey lectures from companies like The Great Courses) but not much that is specific, frequently updated, and as easy to find as shock jocks on the radio. As Nelson put it, “I am considering starting a business that provides daily audio updates and learning materials for professionals.” Seems like there’s a big hole in the market here.

How Computer Viruses Work

Strange as it may sound, the computer virus is something of an Information Age marvel. On one hand, viruses show us how vulnerable we are -- a properly engineered virus can have a devastating effect, disrupting productivity and doing billions of dollars in damages. On the other hand, they show us how sophisticated and interconnected human beings have become.
For example, experts estimate that the Mydoom worm infected approximately a quarter-million computers in a single day in January 2004. Back in March 1999, the Melissa virus was so powerful that it forced Microsoft and a number of other very large companies to completely turn off their e-mail systems until the virus could be contained. The ILOVEYOU virus in 2000 had a similarly devastating effect. In January 2007, a worm called Storm appeared -- by October, experts believed up to 50 million computers were infected. That's pretty impressive when you consider that many viruses are incredibly simple.
When you listen to the news, you hear about many different forms of electronic infection. The most common are:

* Viruses - A virus is a small piece of software that piggybacks on real programs. For example, a virus might attach itself to a program such as a spreadsheet program. Each time the spreadsheet program runs, the virus runs, too, and it has the chance to reproduce (by attaching to other programs) or wreak havoc.
* E-mail viruses - An e-mail virus travels as an attachment to e-mail messages, and usually replicates itself by automatically mailing itself to dozens of people in the victim's e-mail address book. Some e-mail viruses don't even require a double-click -- they launch when you view the infected message in the preview pane of your e-mail software [source: Johnson].
* Trojan horses - A Trojan horse is simply a computer program. The program claims to do one thing (it may claim to be a game) but instead does damage when you run it (it may erase your hard disk). Trojan horses have no way to replicate automatically.
* Worms - A worm is a small piece of software that uses computer networks and security holes to replicate itself. A copy of the worm scans the network for another machine that has a specific security hole. It copies itself to the new machine using the security hole, and then starts replicating from there, as well.

In this article, we will discuss viruses -- both "traditional" viruses and e-mail viruses -- so that you can learn how they work and understand how to protect yourself.

